Prior Learning and Gibbs Reaction-Diffusion
This article addresses two important themes in early visual computation: it presents a novel theory for learning the universal statistics of natural images, and it proposes a general framework for designing reaction-diffusion equations for image processing. We studied the statistics of natural images, including their scale-invariant properties, and then learned generic prior models that duplicate the observed statistics, based on minimax entropy theory. The resulting Gibbs distributions have potentials of the form U(I; Λ, S) = Σ_{α=1}^{K} Σ_{x,y} λ^{(α)}((F^{(α)} * I)(x, y)), with S = {F^{(1)}, F^{(2)}, ..., F^{(K)}} a set of filters and Λ = {λ^{(1)}(·), λ^{(2)}(·), ..., λ^{(K)}(·)} the potential functions. The learned Gibbs distributions confirm and improve the form of existing prior models such as the line process, but, in contrast to all previous models, inverted potentials were found to be necessary. We find that the partial differential equations given by gradient descent on U(I; Λ, S) are essentially reaction-diffusion equations, where the usual energy terms produce anisotropic diffusion, while the inverted energy terms produce reaction associated with pattern formation, enhancing preferred image features. We illustrate how these models can be used for texture pattern rendering, denoising, image enhancement, and clutter removal by careful choice of both prior and data models of this type, incorporating the appropriate features.
A Tutorial on Bayesian Nonparametric Models
A key problem in statistical modeling is model selection: how to choose a
model at an appropriate level of complexity. This problem appears in many
settings, most prominently in choosing the number of clusters in mixture models
or the number of factors in factor analysis. In this tutorial we describe
Bayesian nonparametric methods, a class of methods that side-steps this issue
by allowing the data to determine the complexity of the model. This tutorial is
a high-level introduction to Bayesian nonparametric methods and contains
several examples of their application.

Comment: 28 pages, 8 figures
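The canonical example of letting the data determine model complexity is the Dirichlet process mixture, whose prior over partitions is the Chinese restaurant process. A minimal sketch of that prior (parameters and names are illustrative, not from the tutorial):

```python
import numpy as np

def crp_partition(n, alpha, rng):
    # Chinese restaurant process: customer i joins existing table k with
    # probability n_k / (i + alpha), or starts a new table with
    # probability alpha / (i + alpha). The number of tables (clusters)
    # is not fixed in advance; it grows with the data.
    counts = []        # customers per table
    assignments = []   # table index for each customer
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):   # the "new table" outcome
            counts.append(1)
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments, counts

rng = np.random.default_rng(1)
assign, counts = crp_partition(100, alpha=2.0, rng=rng)
```

Larger concentration alpha favors more clusters; with alpha fixed, the expected number of clusters grows only logarithmically in n, which is how the model "side-steps" choosing the number of clusters by hand.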
NEXUS/Physics: An interdisciplinary repurposing of physics for biologists
In response to increasing calls for the reform of the undergraduate science
curriculum for life science majors and pre-medical students (Bio2010,
Scientific Foundations for Future Physicians, Vision & Change), an
interdisciplinary team has created NEXUS/Physics: a repurposing of an
introductory physics curriculum for the life sciences. The curriculum interacts
strongly and supportively with introductory biology and chemistry courses taken
by life sciences students, with the goal of helping students build general,
multi-discipline scientific competencies. In order to do this, our two-semester
NEXUS/Physics course sequence is positioned as a second-year course, so students
will have had some exposure to basic concepts in biology and chemistry.
To facilitate this, NEXUS/Physics stresses interdisciplinary examples, and its
content differs markedly from traditional introductory physics. It extends
the discussion of energy to include interatomic potentials and chemical
reactions, the discussion of thermodynamics to include enthalpy and Gibbs free
energy, and includes a serious discussion of random vs. coherent motion
including diffusion. The development of instructional materials is coordinated
with careful education research. Both the new content and the results of the
research are described in a series of papers for which this paper serves as an
overview and context.

Comment: 12 pages
Sequential Bayesian updating for Big Data
The velocity, volume, and variety of big data present both challenges and opportunities for cognitive science. We introduce sequential Bayesian updating as a tool to mine these three core properties. In the Bayesian approach, we summarize the current state of knowledge regarding parameters in terms of their posterior distributions, and use these as prior distributions when new data become available. Crucially, we construct posterior distributions in such a way that we avoid having to repeat computing the likelihood of old data as new data become available, allowing the propagation of information without great computational demand. As a result, these Bayesian methods allow continuous inference on voluminous information streams in a timely manner. We illustrate the advantages of sequential Bayesian updating with data from the MindCrowd project, in which crowd-sourced data are used to study Alzheimer's Dementia. We fit an extended LATER (Linear Approach to Threshold with Ergodic Rate) model to reaction time data from the project in order to separate two distinct aspects of cognitive functioning: speed of information accumulation and caution.
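The posterior-becomes-prior mechanism is easiest to see in a conjugate model. The Beta-Bernoulli sketch below is a stand-in for illustration (the paper fits a LATER model, not this): folding in mini-batches one at a time reproduces the batch posterior exactly, without ever revisiting old data.

```python
import numpy as np

def update_beta(a, b, data):
    # Conjugate Beta-Bernoulli update: observing `data` (0/1 outcomes)
    # turns a Beta(a, b) prior into Beta(a + successes, b + failures).
    # The old data's influence lives entirely in (a, b), so the
    # likelihood of past observations never has to be recomputed.
    s = int(np.sum(data))
    return a + s, b + len(data) - s

rng = np.random.default_rng(2)
stream = rng.integers(0, 2, size=1000)   # a stream of binary outcomes

# Sequential: fold in one mini-batch at a time, posterior -> prior.
a, b = 1.0, 1.0
for batch in np.array_split(stream, 10):
    a, b = update_beta(a, b, batch)

# Batch: process everything at once; the two posteriors agree exactly.
a_all, b_all = update_beta(1.0, 1.0, stream)
assert (a, b) == (a_all, b_all)
```

Each sequential step costs only the current mini-batch, which is what makes this kind of updating viable on a continuous, voluminous data stream.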