Approximate renormalization for the break-up of invariant tori with three frequencies
We construct an approximate renormalization transformation for Hamiltonian systems with three degrees of freedom in order to study the break-up of invariant tori with three incommensurate frequencies belonging to a cubic field. This renormalization has two fixed points: a stable one and a hyperbolic one with a codimension-one stable manifold. We compute the associated critical exponents that characterize the universality class for the break-up of the invariant tori we consider.
Comment: 5 pages, REVTeX
A practical Bayesian framework for backpropagation networks
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible (1) objective comparisons between solutions using alternative network architectures, (2) objective stopping rules for network pruning or growing procedures, (3) objective choice of magnitude and type of weight decay terms or additive regularizers (for penalizing large weights, etc.), (4) a measure of the effective number of well-determined parameters in a model, (5) quantified estimates of the error bars on network parameters and on network output, and (6) objective comparisons with alternative learning and interpolation models such as splines and radial basis functions. The Bayesian "evidence" automatically embodies "Occam's razor," penalizing overflexible and overcomplex models. The Bayesian approach helps detect poor underlying assumptions in learning models. For learning models well matched to a problem, a good correlation between generalization ability and the Bayesian evidence is obtained.
Bayesian interpolation
Although Bayesian analysis has been in use since Laplace, the Bayesian method of model-comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularization and model-comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other data modeling problems. Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. "Occam's razor" is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling.
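The evidence framework summarized above can be illustrated with a small numerical sketch. The data, basis choice, and hyperparameters below are illustrative assumptions, not taken from the paper: for a linear-in-parameters model with a Gaussian prior and Gaussian noise, the marginal likelihood (evidence) has a closed form, and comparing it across basis sets exhibits the automatic Occam's razor.

```python
import numpy as np

# Toy sketch of evidence-based model comparison (data, basis, and
# hyperparameters are illustrative assumptions, not from the paper).
# For y = Phi w + noise, with w ~ N(0, I/alpha) and noise ~ N(0, sigma2*I),
# the evidence is y ~ N(0, K) with K = sigma2*I + Phi Phi^T / alpha.

def log_evidence(Phi, y, alpha=1.0, sigma2=0.05):
    N = len(y)
    K = sigma2 * np.eye(N) + Phi @ Phi.T / alpha
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (N * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(K, y))

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 30)
y = 1.5 * x - 0.5 * x**2 + rng.normal(scale=np.sqrt(0.05), size=x.size)

# Evidence for polynomial bases of increasing degree; the underfitting
# degree-1 model scores lower than the degree-2 model on this data.
for degree in (1, 2, 6):
    Phi = np.vander(x, degree + 1, increasing=True)
    print(degree, round(log_evidence(Phi, y), 2))
```

The degree-1 model cannot absorb the quadratic component of the signal, so its misfit term dominates; the higher-degree models pay an automatic complexity penalty through the log-determinant term rather than through any explicit parameter count.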
Information-based objective functions for active data selection
Learning can be made more efficient if we can actively select particularly salient data points. Within a Bayesian learning framework, objective functions are discussed that measure the expected informativeness of candidate measurements. Three alternative specifications of what we want to gain information about lead to three different criteria for data selection. All these criteria depend on the assumption that the hypothesis space is correct, which may prove to be their main weakness.
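As a concrete, entirely hypothetical illustration of the kind of criterion discussed above: for a Bayesian linear model with a Gaussian posterior over the weights, the expected information gained by measuring at a candidate input grows with the predictive variance there, so one queries the point about which the model is most uncertain. The basis and numbers below are assumptions for the sketch, not from the paper.

```python
import numpy as np

# Hedged sketch of an information-based selection criterion (names and
# numbers are illustrative, not from the paper). For y = w . phi(x) + noise
# with posterior covariance Sigma over w, the expected information gain at x
# is 0.5 * log(1 + phi(x)^T Sigma phi(x) / noise_var), so we pick the
# candidate with the largest predictive variance.

def phi(x):
    return np.array([1.0, x, x ** 2])      # assumed polynomial basis

def most_informative(candidates, Sigma, noise_var=0.1):
    gains = [0.5 * np.log(1 + phi(x) @ Sigma @ phi(x) / noise_var)
             for x in candidates]
    return candidates[int(np.argmax(gains))]

Sigma = np.eye(3)                          # broad posterior over weights
xs = np.linspace(-2.0, 2.0, 9)
print(most_informative(xs, Sigma))         # an endpoint: largest variance
```

With a quadratic basis and an isotropic posterior, the predictive variance grows with |x|, so the criterion selects an extreme candidate; this also hints at the weakness noted above, since the preference is only as trustworthy as the assumed hypothesis space.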
Quick disconnect latch and handle combination (Patent)
Quick disconnect latch and handle combination for mounting articles on walls or supporting bases in spacecraft under zero gravity condition.
Diffraction-limited CCD imaging with faint reference stars
By selecting short exposure images taken using a CCD with negligible readout noise, we obtained essentially diffraction-limited 810 nm images of faint objects using nearby reference stars brighter than I=16 at a 2.56 m telescope. The FWHM of the isoplanatic patch for the technique is found to be 50 arcseconds, providing ~20% sky coverage around suitable reference stars.
Comment: 4 page letter accepted for publication in Astronomy and Astrophysics
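The frame-selection idea behind this result can be sketched in a few lines. The code below is a toy illustration on synthetic frames, not the authors' pipeline: score each short exposure by how concentrated its brightest source is, keep the sharpest fraction, recenter each kept frame on the reference star, and average.

```python
import numpy as np

# Toy "lucky imaging" sketch (synthetic frames; not the authors' code):
# keep only the sharpest short exposures, recenter each on its brightest
# pixel (standing in for the reference star), and average.

def sharpness(frame):
    return frame.max() / frame.sum()       # peak-concentration proxy

def lucky_stack(frames, keep_fraction=0.2):
    ranked = sorted(frames, key=sharpness, reverse=True)
    best = ranked[:max(1, int(len(frames) * keep_fraction))]
    out = np.zeros_like(best[0], dtype=float)
    c = np.array(out.shape) // 2
    for f in best:
        p = np.unravel_index(np.argmax(f), f.shape)
        out += np.roll(f, shift=(c[0] - p[0], c[1] - p[1]), axis=(0, 1))
    return out / len(best)

rng = np.random.default_rng(2)
frames = []
for _ in range(50):
    f = 0.01 * rng.random((32, 32))        # faint background noise
    f[rng.integers(10, 22), rng.integers(10, 22)] = 1.0   # wandering star
    frames.append(f)

stacked = lucky_stack(frames)
print(np.unravel_index(np.argmax(stacked), stacked.shape))   # (16, 16)
```

After recentering, the star lands on the same pixel in every kept frame while the background averages down, which is the essence of why selecting and registering short exposures recovers a sharp core.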
Analysis of Linsker's simulations of Hebbian rules
Linsker has reported the development of center-surround receptive fields and oriented receptive fields in simulations of a Hebb-type equation in a linear network. The dynamics of the learning rule are analyzed in terms of the eigenvectors of the covariance matrix of cell activities. Analytic and computational results for Linsker's covariance matrices, and some general theorems, lead to an explanation of the emergence of center-surround and certain oriented structures. We estimate criteria for the parameter regime in which center-surround structures emerge.
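The role of the covariance eigenvectors can be illustrated numerically. The sketch below uses a toy 2x2 covariance matrix of my own choosing, not Linsker's network: under a linear Hebbian rule, the weight vector rotates toward the principal eigenvector of the covariance matrix, which is why those eigenvectors predict the receptive-field structures that emerge.

```python
import numpy as np

# Toy illustration of the eigenvector analysis (the covariance matrix is
# assumed, not Linsker's): under a linear Hebbian rule w <- w + lr * C @ w,
# the weight vector aligns with the principal eigenvector of C.

C = np.array([[2.0, 0.8],
              [0.8, 1.0]])                 # assumed input covariance

rng = np.random.default_rng(0)
w = rng.normal(size=2)                     # random initial weights
for _ in range(200):
    w = w + 0.1 * (C @ w)                  # Hebbian growth
    w = w / np.linalg.norm(w)              # track direction only

evals, evecs = np.linalg.eigh(C)
v = evecs[:, np.argmax(evals)]             # principal eigenvector of C
print(round(abs(w @ v), 6))                # 1.0: fully aligned
```

The growth of each eigenvector component is exponential in its eigenvalue, so the largest eigenvalue dominates after a modest number of steps.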
The Role of Constraints in Hebbian Learning
Models of unsupervised, correlation-based (Hebbian) synaptic plasticity are typically unstable: either all synapses grow until each reaches the maximum allowed strength, or all synapses decay to zero strength. A common method of avoiding these outcomes is to use a constraint that conserves or limits the total synaptic strength over a cell. We study the dynamic effects of such constraints.
Two methods of enforcing a constraint are distinguished, multiplicative and subtractive. For otherwise linear learning rules, multiplicative enforcement of a constraint results in dynamics that converge to the principal eigenvector of the operator determining unconstrained synaptic development. Subtractive enforcement, in contrast, typically leads to a final state in which almost all synaptic strengths reach either the maximum or minimum allowed value. This final state is often dominated by weight configurations other than the principal eigenvector of the unconstrained operator. Multiplicative enforcement yields a "graded" receptive field in which most mutually correlated inputs are represented, whereas subtractive enforcement yields a receptive field that is "sharpened" to a subset of maximally correlated inputs. If two equivalent input populations (e.g., two eyes) innervate a common target, multiplicative enforcement prevents their segregation (ocular dominance segregation) when the two populations are weakly correlated; whereas subtractive enforcement allows segregation under these circumstances.
These results may be used to understand constraints both over output cells and over input cells. A variety of rules that can implement constrained dynamics are discussed.
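The multiplicative/subtractive contrast can be seen in a toy simulation. Everything below (the covariance, learning rate, and bounds) is an illustrative assumption, not the paper's model: both schemes conserve the total synaptic strength at every step, yet they end in qualitatively different states.

```python
import numpy as np

# Toy contrast of the two constraint schemes (covariance, rates, and
# bounds are illustrative assumptions, not the paper's model). Both runs
# conserve total synaptic strength, but they converge very differently.

C = np.array([[1.0, 0.6],
              [0.6, 0.9]])                 # assumed input covariance

def step_multiplicative(w, total, lr=0.05):
    w = w + lr * (C @ w)
    return w * (total / w.sum())           # rescale to conserve sum(w)

def step_subtractive(w, lr=0.05, wmax=1.5):
    dw = lr * (C @ w)
    dw = dw - dw.mean()                    # subtract equally from each synapse
    return np.clip(w + dw, 0.0, wmax)      # bounds needed for stability

w0 = np.array([0.8, 0.7])
wm = w0.copy(); ws = w0.copy()
for _ in range(400):
    wm = step_multiplicative(wm, w0.sum())
    ws = step_subtractive(ws)

print(wm.round(3))   # graded: both weights positive, eigenvector-like
print(ws.round(3))   # sharpened: one weight at the bound, the other at zero
```

The multiplicative run settles on a rescaled principal eigenvector of C, so both correlated inputs keep nonzero weight; the subtractive run drives the weight difference to grow until one synapse saturates at the upper bound and the other at zero, matching the "graded" versus "sharpened" receptive fields described above.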