Error threshold in optimal coding, numerical criteria and classes of universalities for complexity
The free energy of the Random Energy Model at the transition point between the
ferromagnetic and spin-glass phases is calculated. At this point, which is
equivalent to the decoding error threshold in optimal codes, the free energy
has finite-size corrections proportional to the square root of the number of
degrees of freedom. The response of the magnetization to the ferromagnetic
couplings is maximal when the magnetization equals one half. We give several
criteria of complexity and define different universality classes. According to
our classification, in the lowest complexity class are random graphs, Markov
models, and hidden Markov models. At the next level is the
Sherrington-Kirkpatrick spin glass, connected with neural-network models. On a
higher level are critical theories, the spin-glass phase of the Random Energy
Model, percolation, and self-organized criticality (SOC). The top-level class
involves HOT design, the error threshold in optimal coding, language, and
possibly financial markets. Living systems are also related to this last
class. A concept of anti-resonance is suggested for complex systems.
Comment: 17 pages
Hamiltonian Monte Carlo Acceleration Using Surrogate Functions with Random Bases
For big data analysis, the high computational cost of Bayesian methods often
limits their application in practice. In recent years, there have been many
attempts to improve the computational efficiency of Bayesian inference. Here we
propose an efficient and scalable computational technique for a
state-of-the-art Markov chain Monte Carlo (MCMC) method, namely Hamiltonian
Monte Carlo (HMC). The key idea is to explore and exploit the structure and
regularity in the parameter space of the underlying probabilistic model to
construct an effective approximation of its geometric properties. To this end,
we build a surrogate function to approximate the target distribution using
properly chosen random bases and an efficient optimization process. The
resulting method provides a flexible, scalable, and efficient sampling
algorithm that converges to the correct target distribution. We show that by
choosing the basis functions and optimization process differently, our method
can be related to other approaches for constructing surrogate functions, such
as generalized additive models or Gaussian process models. Experiments based
on simulated and real data show that our approach leads to substantially more
efficient sampling algorithms than existing state-of-the-art methods.
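To illustrate the kind of sampler being accelerated, here is a minimal plain HMC sketch for a standard 2-D Gaussian target. In the surrogate approach described above, the exact gradient would be replaced by the gradient of a cheap random-basis approximation inside the leapfrog steps, while the accept/reject step keeps the exact target; the target, step size, and path length here are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def logp(q):
    return -0.5 * np.dot(q, q)   # log-density of N(0, I), up to a constant

def grad_logp(q):
    return -q                    # its gradient (a surrogate gradient would go here)

def hmc_sample(n_samples, eps=0.1, n_leapfrog=20, seed=0):
    rng = np.random.default_rng(seed)
    q = np.zeros(2)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(2)                 # resample momentum
        q_new, p_new = q.copy(), p.copy()
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * eps * grad_logp(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += eps * p_new
            p_new += eps * grad_logp(q_new)
        q_new += eps * p_new
        p_new += 0.5 * eps * grad_logp(q_new)
        # Metropolis accept/reject using the exact target density
        h_old = -logp(q) + 0.5 * np.dot(p, p)
        h_new = -logp(q_new) + 0.5 * np.dot(p_new, p_new)
        if rng.random() < np.exp(h_old - h_new):
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

samples = hmc_sample(2000)
print(samples.mean(axis=0))  # close to [0, 0]
```

Because the accept/reject step evaluates the exact target, a surrogate used only for the leapfrog gradients changes efficiency but not the stationary distribution, which is why the method converges to the correct target.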
Handover Management in Highly Dense Femtocellular Networks
For dense femtocells, an intelligent integrated femtocell/macrocell network
architecture, a neighbor cell list with a minimum number of femtocells,
effective call admission control (CAC), and handover processes with proper
signaling are open research issues. An appropriate traffic model for the
integrated femtocell/macrocell network has also not yet been developed. In
this paper, we address the major issues of mobility management for the
integrated femtocell/macrocell network. We propose a novel algorithm to create
a neighbor cell list with a minimum, but appropriate, number of cells for
handover. We also propose detailed handover procedures and a novel traffic
model for the integrated femtocell/macrocell network. The proposed CAC
effectively handles various call types. The numerical and simulation results
show the importance of the integrated femtocell/macrocell network and the
performance improvement achieved by the proposed schemes. Our proposed schemes
for dense femtocells should be useful for researchers and industry
practitioners to implement.
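The general idea of a pruned neighbor cell list can be sketched as follows: keep only cells whose received signal strength (RSS) makes them plausible handover candidates, capped at a small list size. This is an illustrative sketch in the spirit of the abstract, not the paper's algorithm; the threshold and cap values are assumptions.

```python
def build_neighbor_list(rss_by_cell, rss_threshold_dbm=-85.0, max_cells=8):
    """Build a pruned neighbor cell list.

    rss_by_cell: dict mapping cell id -> measured RSS in dBm.
    Cells below the RSS threshold are excluded; the strongest
    max_cells survivors are kept, strongest first.
    """
    candidates = [(cid, rss) for cid, rss in rss_by_cell.items()
                  if rss >= rss_threshold_dbm]
    candidates.sort(key=lambda cr: cr[1], reverse=True)   # strongest first
    return [cid for cid, _ in candidates[:max_cells]]

# Hypothetical measurements: one strong femtocell, one weak femtocell,
# and the umbrella macrocell.
measurements = {"femto_a": -70.0, "femto_b": -90.0, "macro_1": -80.0}
print(build_neighbor_list(measurements))  # ['femto_a', 'macro_1']
```

Keeping the list short matters in dense deployments because every listed cell adds measurement and signaling overhead during handover preparation.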
Sequential Bayesian inference for implicit hidden Markov models and current limitations
Hidden Markov models can describe time series arising in various fields of
science, by treating the data as noisy measurements of an arbitrarily complex
Markov process. Sequential Monte Carlo (SMC) methods have become standard tools
to estimate the hidden Markov process given the observations and a fixed
parameter value. We review some of the recent developments allowing the
inclusion of parameter uncertainty as well as model uncertainty. The
shortcomings of the currently available methodology are emphasised from an
algorithmic complexity perspective. The statistical objects of interest for
time series analysis are illustrated on a toy "Lotka-Volterra" model used in
population ecology. Some open challenges are discussed regarding the
scalability of the reviewed methodology to longer time series,
higher-dimensional state spaces, and more flexible models.
Comment: Review article written for ESAIM: Proceedings and Surveys. 25 pages,
10 figures
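The core SMC tool the review builds on, the bootstrap particle filter, can be sketched on a toy linear-Gaussian hidden Markov model: x_t = 0.9 x_{t-1} + N(0, 1), y_t = x_t + N(0, 0.5^2). This model and its parameter values are illustrative assumptions, simpler than the Lotka-Volterra example in the text, but the propagate/weight/resample loop is the same.

```python
import numpy as np

def bootstrap_filter(ys, n_particles=1000, phi=0.9,
                     sigma_x=1.0, sigma_y=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)       # initial particle cloud
    means = []
    for y in ys:
        # Propagate particles through the Markov transition
        x = phi * x + sigma_x * rng.standard_normal(n_particles)
        # Weight by the observation likelihood (Gaussian, up to a constant)
        w = np.exp(-0.5 * ((y - x) / sigma_y) ** 2)
        w /= w.sum()
        means.append(np.dot(w, x))             # filtering mean estimate
        # Multinomial resampling to avoid weight degeneracy
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return np.array(means)

# Simulate a trajectory from the same toy model, then filter it
rng = np.random.default_rng(1)
xs, x = [], 0.0
for _ in range(50):
    x = 0.9 * x + rng.standard_normal()
    xs.append(x)
ys = np.array(xs) + 0.5 * rng.standard_normal(50)
means = bootstrap_filter(ys)
```

The scalability concerns discussed in the review show up directly here: cost grows linearly in both the number of particles and the length of the series, and the number of particles needed grows quickly with the state dimension.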