Algorithmic Bayesian Epistemology
One aspect of the algorithmic lens in theoretical computer science is a view
on other scientific disciplines that focuses on satisfactory solutions that
adhere to real-world constraints, as opposed to solutions that would be optimal
ignoring such constraints. The algorithmic lens has provided a unique and
important perspective on many academic fields, including molecular biology,
ecology, neuroscience, quantum physics, economics, and social science.
This thesis applies the algorithmic lens to Bayesian epistemology.
Traditional Bayesian epistemology provides a comprehensive framework for how an
individual's beliefs should evolve upon receiving new information. However,
these methods typically assume an exhaustive model of such information,
including the correlation structure between different pieces of evidence. In
reality, individuals might lack such an exhaustive model, while still needing
to form beliefs. Beyond such informational constraints, an individual may be
bounded by limited computation, or by limited communication with agents that
have access to information, or by the strategic behavior of such agents. Even
when these restrictions prevent the formation of a *perfectly* accurate belief,
arriving at a *reasonably* accurate belief remains crucial. In this thesis, we
establish fundamental possibility and impossibility results about belief
formation under a variety of restrictions, and lay the groundwork for further
exploration.
Comment: 385 pages, 14 figures, 4 tables; PhD thesis.
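The classical updating framework the abstract contrasts against can be made concrete with a minimal sketch of Bayes' rule. The hypotheses, prior, and likelihood values below are invented for illustration; they are not taken from the thesis.

```python
# Minimal illustration of traditional Bayesian updating: a belief over
# hypotheses is revised by Bayes' rule when new evidence arrives.
# All numbers here are hypothetical.

def bayes_update(prior, likelihood):
    """Return the posterior over hypotheses given per-hypothesis likelihoods."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Prior: 50/50 over "H" (hypothesis true) vs. "~H" (hypothesis false).
prior = {"H": 0.5, "~H": 0.5}
# The observed evidence is four times as likely under H as under ~H.
posterior = bayes_update(prior, {"H": 0.8, "~H": 0.2})
print(posterior["H"])  # 0.8
```

The thesis's point of departure is that this update presupposes a full probabilistic model of the evidence (here, the likelihood of every observation under every hypothesis), which real agents often lack.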
On Orders of Elliptic Curves over Finite Fields
In this work, we completely characterize, by j-invariant, the number of orders of elliptic curves over all finite fields using combinatorial arguments and elementary number theory. Whenever possible, we state and prove exactly which orders can be taken on.
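The orders the abstract refers to can be computed directly by brute force over a small prime field. The sketch below simply enumerates the group orders realized by curves y² = x³ + ax + b over F_p; it is an illustration of the object being characterized, not the paper's combinatorial argument.

```python
# Brute-force sketch: enumerate the group orders realized by elliptic curves
# y^2 = x^3 + a*x + b over a small prime field F_p. Hasse's theorem bounds
# every such order within p + 1 +/- 2*sqrt(p).

def curve_order(a, b, p):
    """Number of points on y^2 = x^3 + a*x + b over F_p, including infinity."""
    # Count how many y give each quadratic residue value.
    squares = {}
    for y in range(p):
        squares[y * y % p] = squares.get(y * y % p, 0) + 1
    count = 1  # point at infinity
    for x in range(p):
        count += squares.get((x * x * x + a * x + b) % p, 0)
    return count

def realized_orders(p):
    """Set of orders taken over all nonsingular curves over F_p."""
    orders = set()
    for a in range(p):
        for b in range(p):
            if (4 * a ** 3 + 27 * b ** 2) % p != 0:  # nonsingular only
                orders.add(curve_order(a, b, p))
    return orders

print(sorted(realized_orders(7)))
```

For example, y² = x³ + x + 1 over F₅ has 9 points, which indeed lies in the Hasse interval [6 − 2√5, 6 + 2√5].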
Purification and Characterization of Trehalase From Acyrthosiphon pisum, a Target for Pest Control.
Insect trehalases are glycoside hydrolases essential for trehalose metabolism and stress resistance. Here we report the extraction and purification of Acyrthosiphon pisum soluble trehalase (ApTreh-1), its biochemical and structural characterization, and the determination of its kinetic properties. The protein was purified by ammonium sulphate precipitation, followed first by anion-exchange and then by affinity chromatography. SDS-PAGE shows a main band at 70 kDa containing two isoforms of ApTreh-1 (X1 and X2), identified by mass spectrometry and differing slightly in the C-terminal region. A phylogenetic tree, a multiple sequence alignment, and a modelled 3D structure were constructed, and all reveal the similarity of ApTreh-1 to other insect trehalases, i.e. the two signature motifs (179)PGGRFRELYYWDTY(192) and (479)QWDFPNAWPP(489), a glycine-rich region (549)GGGGEY(554), and the catalytic residues Asp336 and Glu538. Optimum enzyme activity occurs at 45 °C and pH 5.0, with K(m) and V(max) values of ~71 mM and ~126 µmol/min/mg, respectively. The present structural and functional characterization of soluble A. pisum trehalase contributes to the development of new strategies to control the aphid pest without significant risk to non-target organisms or human health.
Visibility science operations with the Keck Interferometer
The visibility science mode of the Keck Interferometer fully transitioned into operations with the successful completion of its operational readiness review in April 2004. The goal of this paper is to describe this science mode and the operations structure that supports it.
Computing the smallest fixed point of order-preserving nonexpansive mappings arising in positive stochastic games and static analysis of programs
The problem of computing the smallest fixed point of an order-preserving map
arises in the study of zero-sum positive stochastic games. It also arises in
static analysis of programs by abstract interpretation. In this context, the
discount rate may be negative. We characterize the minimality of a fixed point
in terms of the nonlinear spectral radius of a certain semidifferential. We
apply this characterization to design a policy iteration algorithm, which
applies to the case of finite state and action spaces. The algorithm returns a
locally minimal fixed point, which turns out to be globally minimal when the
discount rate is nonnegative.
Comment: 26 pages, 3 figures. We add new results, improvements and two
examples of positive stochastic games. Note that an initial version of the
paper has appeared in the proceedings of the Eighteenth International
Symposium on Mathematical Theory of Networks and Systems (MTNS2008),
Blacksburg, Virginia, July 200
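The notion of a smallest fixed point of an order-preserving map can be illustrated with a toy Kleene iteration on a finite lattice. This is not the paper's policy-iteration algorithm (which operates on real-valued nonexpansive maps, possibly with negative discount rates); it only shows, under simplified assumptions, what the least fixed point is and why iterating from the bottom element reaches it.

```python
# Toy illustration of a least fixed point: Kleene iteration from the bottom
# element of a finite lattice. For a monotone map on a finite lattice this
# converges to the smallest fixed point. This is NOT the paper's
# policy-iteration algorithm; the map below is a made-up example.

def least_fixed_point(f, bottom):
    """Iterate f from bottom until stabilization (Kleene iteration)."""
    x = bottom
    while True:
        fx = f(x)
        if fx == x:
            return x
        x = fx

# Monotone map on subsets of {0, 1, 2, 3}, ordered by inclusion:
# always include 0, keep the current set, and add (n + 1) mod 4 for
# every even member n. This is the flavor of map used in abstract
# interpretation, where least fixed points describe program invariants.
def f(s):
    return frozenset({0}) | s | frozenset((n + 1) % 4 for n in s if n % 2 == 0)

lfp = least_fixed_point(f, frozenset())
print(sorted(lfp))  # [0, 1]
```

Starting from the empty set, the iterates are ∅ → {0} → {0, 1} → {0, 1}, so {0, 1} is the least fixed point; every other fixed point of f contains it.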
The fallacy of placing confidence in confidence intervals
Interval estimates – estimates of parameters that include an allowance for sampling uncertainty – have long been touted as a key component of statistical analyses. There are several kinds of interval estimates, but the most popular are confidence intervals (CIs): intervals that contain the true parameter value in some known proportion of repeated samples, on average. The width of confidence intervals is thought to index the precision of an estimate; CIs are thought to be a guide to which parameter values are plausible or reasonable; and the confidence coefficient of the interval (e.g., 95%) is thought to index the plausibility that the true parameter is included in the interval. We show in a number of examples that CIs do not necessarily have any of these properties, and can lead to unjustified or arbitrary inferences. For this reason, we caution against relying upon confidence interval theory to justify interval estimates, and suggest that other theories of interval estimation should be used instead.
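The coverage property the abstract defines can be checked by simulation. The sketch below verifies only that textbook property for a normal mean with known standard deviation; the paper's argument is precisely that coverage alone does not license the further interpretations (precision, plausibility) listed above. All parameter values are invented for illustration.

```python
# Simulate the long-run coverage of a 95% confidence interval for a normal
# mean with known sigma: mean +/- 1.96 * sigma / sqrt(n). Coverage should
# land near 0.95. Coverage is the ONLY property this demonstrates.
import random
import statistics

random.seed(0)
true_mean, sigma, n, z = 10.0, 2.0, 25, 1.96  # z for a 95% interval
trials = 2000
half_width = z * sigma / n ** 0.5

hits = 0
for _ in range(trials):
    sample = [random.gauss(true_mean, sigma) for _ in range(n)]
    m = statistics.fmean(sample)
    if m - half_width <= true_mean <= m + half_width:
        hits += 1

print(hits / trials)  # close to 0.95
```

Note that the interval's long-run hit rate says nothing about whether any particular realized interval is a reasonable summary of where the parameter lies, which is the distinction the paper presses.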