McKay Correspondence in Quasitoric Orbifolds
We prove a McKay correspondence for the Betti numbers of the Chen-Ruan
cohomology of omnioriented quasitoric orbifolds. In previous articles with
M. Poddar [8], [9], we proved the correspondence in dimensions four and six.
Here we deal with the general case.
Comment: 11 pages
Phase Transition in a Long Range Antiferromagnetic Model
We consider an Ising model in which the longitudinal components of every pair
of spins interact antiferromagnetically with the same magnitude. When subjected
to a transverse magnetic field at zero temperature, the system undergoes a
second-order phase transition to an ordered phase; if the temperature is then
increased, there is another phase transition to a disordered phase. We derive
these features by a perturbative treatment up to second order and argue that
the results are non-trivial and cannot be obtained from known results about
related models.
Comment: 3 pages
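For concreteness, one way to write a Hamiltonian of the kind described in this
abstract, with the longitudinal components of every pair of spins coupled
antiferromagnetically with equal strength and a transverse field applied, is
the sketch below; the 1/N normalization and the symbols J and \Gamma are
notational assumptions, not taken from the paper:

\[ H = \frac{J}{N} \sum_{i<j} \sigma_i^z \sigma_j^z \;-\; \Gamma \sum_i \sigma_i^x, \qquad J > 0, \]

where \Gamma is the transverse field driving the zero-temperature transition
described above.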
SuperSpike: Supervised learning in multi-layer spiking neural networks
A vast majority of computation in the brain is performed by spiking neural
networks. Despite the ubiquity of such spiking, we currently lack an
understanding of how biological spiking neural circuits learn and compute
in vivo, as well as how we can instantiate such capabilities in artificial
spiking circuits in silico. Here we revisit the problem of supervised learning
in temporally coding multi-layer spiking neural networks. First, by using a
surrogate gradient approach, we derive SuperSpike, a nonlinear, voltage-based,
three-factor learning rule capable of training multi-layer networks of
deterministic integrate-and-fire neurons to perform nonlinear computations on
spatiotemporal spike patterns. Second, inspired by recent results on feedback
alignment, we compare the performance of our learning rule under different
credit assignment strategies for propagating output errors to hidden units.
Specifically, we test uniform, symmetric and random feedback, finding that
simpler tasks can be solved with any type of feedback, while more complex tasks
require symmetric feedback. In summary, our results open the door to obtaining
a better scientific understanding of learning and computation in spiking neural
networks by advancing our ability to train them to solve nonlinear problems
involving transformations between different spatiotemporal spike-time patterns.
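As a rough illustration of the voltage-based, three-factor idea summarized in
this abstract, the sketch below updates the input weights of a single
integrate-and-fire readout neuron using a fast-sigmoid surrogate gradient, an
output error signal, and a filtered presynaptic trace. The sizes, time
constants, surrogate shape, and target spike times are illustrative
assumptions; error filtering, hidden layers, and the feedback-alignment
comparison are omitted, and this is not the authors' reference implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    T, dt = 500, 1e-3                 # 500 steps of 1 ms
    n_in, n_out = 100, 1              # input spike trains, one readout neuron
    tau_mem, tau_syn = 10e-3, 5e-3    # membrane and synaptic time constants
    v_th, lr = 1.0, 1e-3              # firing threshold, learning rate

    w = rng.normal(0.0, 0.05, (n_out, n_in))
    inputs = (rng.random((T, n_in)) < 0.02).astype(float)   # Poisson-like input spikes
    target = np.zeros((T, n_out))
    target[100::100] = 1.0                                   # desired output spike times

    def surrogate_grad(u):
        # Fast-sigmoid derivative used in place of the non-differentiable spike.
        return 1.0 / (1.0 + 10.0 * np.abs(u - v_th)) ** 2

    v = np.zeros(n_out)               # membrane potential
    trace = np.zeros(n_in)            # filtered presynaptic activity

    for t in range(T):
        trace += dt / tau_syn * (-trace) + inputs[t]
        v += dt / tau_mem * (-v + w @ trace)
        spikes = (v >= v_th).astype(float)
        sg = surrogate_grad(v)        # evaluate surrogate on the pre-reset potential
        v = v * (1.0 - spikes)        # reset neurons that fired

        # Three-factor rule: output error x surrogate gradient x presynaptic trace.
        error = target[t] - spikes
        w += lr * np.outer(error * sg, trace)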
Statistical Mechanics of High-Dimensional Inference
To model modern large-scale datasets, we need efficient algorithms to infer a
set of unknown model parameters from noisy measurements. What are the
fundamental limits on the accuracy of parameter inference, given finite
signal-to-noise ratios, limited measurements, prior information, and
computational tractability requirements? How can we combine prior information
with measurements to achieve these limits? Classical statistics gives incisive
answers to these questions as the measurement density tends to infinity.
However, these classical results are not relevant to modern high-dimensional
inference problems, which instead occur at finite measurement density. We
formulate and analyze high-dimensional inference as a
problem in the statistical physics of quenched disorder. Our analysis uncovers
fundamental limits on the accuracy of inference in high dimensions, and reveals
that widely cherished inference algorithms like maximum likelihood (ML) and
maximum a posteriori (MAP) inference cannot achieve these limits. We further
find optimal, computationally tractable algorithms that can achieve these
limits. Intriguingly, in high dimensions, these optimal algorithms become
computationally simpler than MAP and ML, while still outperforming them. For
example, such optimal algorithms can lead to as much as a 20% reduction in the
amount of data needed to achieve the same performance as MAP. Moreover, our
analysis reveals simple relations between optimal high-dimensional inference
and low-dimensional scalar Bayesian inference, insights into the nature of
generalization and predictive power in high dimensions, information theoretic
limits on compressed sensing, phase transitions in quadratic inference, and
connections to central mathematical objects in convex optimization theory and
random matrix theory.
Comment: See http://ganguli-gang.stanford.edu/pdf/HighDimInf.Supp.pdf for
supplementary material
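As a purely illustrative toy, and not the paper's analysis or its optimal
algorithms, the sketch below draws N noisy linear measurements of P parameters
at a finite measurement density (here taken as N/P, the number of measurements
per parameter) and compares the per-parameter error of maximum likelihood
(ordinary least squares) with MAP under a matched Gaussian prior (ridge
regression). All sizes, noise levels, and the Gaussian generative model are
assumptions made for this example; in this finite-density regime the
prior-aware MAP estimate typically has lower error than plain least squares,
while the paper's optimal estimators go further still and are not reproduced
here.

    import numpy as np

    rng = np.random.default_rng(0)

    P = 200                      # number of unknown parameters (assumed)
    alpha = 2.0                  # measurement density N / P (assumed)
    N = int(alpha * P)
    sigma = 1.0                  # measurement noise standard deviation (assumed)

    w_true = rng.normal(0.0, 1.0, P)                  # parameters from a unit Gaussian prior
    X = rng.normal(0.0, 1.0 / np.sqrt(P), (N, P))     # random measurement matrix
    y = X @ w_true + sigma * rng.normal(0.0, 1.0, N)

    # ML estimate: ordinary least squares.
    w_ml = np.linalg.lstsq(X, y, rcond=None)[0]

    # MAP estimate under the matched Gaussian prior: ridge regression,
    # with ridge strength = noise variance / prior variance.
    lam = sigma ** 2
    w_map = np.linalg.solve(X.T @ X + lam * np.eye(P), X.T @ y)

    for name, w_hat in [("ML ", w_ml), ("MAP", w_map)]:
        mse = np.mean((w_hat - w_true) ** 2)
        print(f"{name} per-parameter squared error: {mse:.3f}")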