
The quantization of the attention function under a Bayes information theoretic model
Bayes experimental design using entropy, or equivalently negative information, as a criterion is fairly well developed. The present work applies this model at a primitive level of statistical sampling. The observer/experimenter is allowed to place a window over the support of a sampling distribution and to "pay for" only those observations that fall inside it. The window is modelled by an "attention function", simply the indicator function of the window, and the cost of the experiment is the number of paid-for observations, n. For fixed n, and under the information model, it turns out that for standard problems the optimal window, in the limit over all types of window including disjoint regions, has discrete structure: it is optimal to observe the world (in this sense) through discrete slits. It also follows that Bayesians with different priors will receive different samples, because their optimal attention windows will typically be disjoint. We refer to this property as the quantization of the attention function.
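The flavour of the result can be illustrated with a toy discrete computation. The sketch below is not the paper's construction: the two-hypothesis prior and the likelihoods are made up for illustration, and the mutual information I(theta; X | X in W) between the parameter and a single paid-for observation is used as a stand-in for the paper's entropy criterion. It enumerates every possible window over a five-point support and reports the one maximising the criterion.

```python
import itertools
import math

def mutual_info_in_window(prior, lik, window):
    """I(theta; X | X in window) for discrete theta and X (in nats).

    prior:  dict theta -> p(theta)
    lik:    dict theta -> list of p(x | theta) over support 0..K-1
    window: iterable of x values the observer "pays for"
    """
    W = sorted(set(window))
    # Joint over (theta, x) restricted to the window, renormalised:
    # conditioning on the event {X in W} updates both theta and x.
    joint = {(t, x): prior[t] * lik[t][x] for t in prior for x in W}
    z = sum(joint.values())
    if z == 0.0:
        return 0.0
    joint = {k: v / z for k, v in joint.items()}
    p_t = {t: sum(joint[(t, x)] for x in W) for t in prior}
    p_x = {x: sum(joint[(t, x)] for t in prior) for x in W}
    mi = 0.0
    for (t, x), p in joint.items():
        if p > 0.0:
            mi += p * math.log(p / (p_t[t] * p_x[x]))
    return mi

# Hypothetical two-hypothesis example (numbers are invented, not from
# the paper): x = 0 and x = 2 discriminate between the hypotheses,
# while x = 1, 3, 4 carry no information about theta.
prior = {0: 0.5, 1: 0.5}
lik = {0: [0.30, 0.30, 0.05, 0.175, 0.175],
       1: [0.05, 0.30, 0.30, 0.175, 0.175]}
support = range(5)

# Brute-force search over all non-empty windows.
best = max(
    (frozenset(w)
     for r in range(1, 6)
     for w in itertools.combinations(support, r)),
    key=lambda w: mutual_info_in_window(prior, lik, w),
)
print(sorted(best), mutual_info_in_window(prior, lik, best))
```

With these toy numbers the search returns the disjoint pair {0, 2} rather than any contiguous interval: the uninformative middle point is excluded because including it only dilutes the truncated sample, which is a discrete analogue of observing through slits.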

Topics: QC