Dynamics and Performance of Susceptibility Propagation on Synthetic Data
We study the performance and convergence properties of the Susceptibility
Propagation (SusP) algorithm for solving the Inverse Ising problem. We first
study how the temperature parameter (T) in a Sherrington-Kirkpatrick model
generating the data influences the performance and convergence of the
algorithm. We find that at the high temperature regime (T>4), the algorithm
performs well and its quality is only limited by the quality of the supplied
data. In the low temperature regime (T<4), we find that the algorithm typically
does not converge, yielding diverging values for the couplings. However, we
show that by stopping the algorithm at the right time before divergence becomes
serious, good reconstruction can be achieved down to T~2. We then show that
dense connectivity, loopiness of the connectivity, and high absolute
magnetization all degrade the performance of the algorithm. When the absolute
magnetization is high, we show that other methods can work better than SusP.
Finally, we show that for neural data with high absolute magnetization, SusP
performs less well than TAP inversion.
Comment: 9 pages, 7 figures
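The TAP inversion used as a baseline above works directly from moment data. The following is a minimal sketch, not the paper's implementation: it assumes the magnetizations m_i and the connected-correlation matrix C are given, uses the naive mean-field relation J_ij = -(C^{-1})_ij off-diagonal, and obtains the TAP couplings by solving the quadratic relation (C^{-1})_ij = -J_ij - 2 J_ij^2 m_i m_j, taking the root that reduces to naive mean field as m -> 0. Function names are illustrative.

```python
import numpy as np

def nmf_couplings(C):
    """Naive mean-field reconstruction: J_ij = -(C^{-1})_ij, zero diagonal."""
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)
    return J

def tap_couplings(C, m, eps=1e-12):
    """TAP inversion: solve (C^{-1})_ij = -J_ij - 2 J_ij^2 m_i m_j pairwise.

    Picks the quadratic root that reduces to naive mean field as m -> 0.
    The discriminant is clipped at zero as a crude guard against the
    regime where the TAP root turns complex.
    """
    Cinv = np.linalg.inv(C)
    mm = np.outer(m, m)
    disc = np.maximum(1.0 - 8.0 * mm * Cinv, 0.0)
    J = np.where(np.abs(mm) > eps,
                 (np.sqrt(disc) - 1.0) / (4.0 * mm + eps),  # TAP root
                 -Cinv)                                      # nMF fallback at m ~ 0
    np.fill_diagonal(J, 0.0)
    return J
```

With zero magnetizations the TAP formula collapses to the naive mean-field answer, which is a quick sanity check on any implementation.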
Cycle-based Cluster Variational Method for Direct and Inverse Inference
We elaborate on the idea that loop corrections to belief propagation could be
dealt with in a systematic way on pairwise Markov random fields, by using the
elements of a cycle basis to define regions in a generalized belief propagation
setting. The region graph is specified in such a way as to avoid dual loops as
much as possible, by discarding redundant Lagrange multipliers, in order to
facilitate convergence while avoiding the instabilities associated with minimal
factor graph construction. We end up with a two-level algorithm, where a belief
propagation algorithm is run alternatively at the level of each cycle and at
the inter-region level. The inverse problem of finding the couplings of a
Markov random field from empirical covariances can be addressed region-wise. It
turns out that this can be done efficiently in particular in the Ising context,
where fixed point equations can be derived along with a one-parameter log
likelihood function to minimize. Numerical experiments confirm the
effectiveness of these considerations both for the direct and inverse MRF
inference.
Comment: 47 pages, 16 figures
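The starting point of the region construction is a cycle basis of the interaction graph. The following is a sketch of the standard fundamental cycle basis (grow a spanning tree; each non-tree edge closes exactly one independent cycle), not the paper's basis-selection heuristic; it assumes a connected graph on nodes 0..n-1 given as an edge list.

```python
from collections import deque

def fundamental_cycle_basis(n, edges):
    """Fundamental cycle basis of a connected undirected graph.

    Grows a BFS spanning tree from node 0; each of the |E| - n + 1
    non-tree edges (u, v) closes one cycle, recovered by walking the
    tree paths from u and v up to their lowest common ancestor.
    """
    adj = {u: [] for u in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    parent = {0: None}          # doubles as the visited set
    tree = set()                # tree edges, stored as frozensets
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                tree.add(frozenset((u, v)))
                queue.append(v)

    def path_to_root(u):
        path = [u]
        while parent[path[-1]] is not None:
            path.append(parent[path[-1]])
        return path

    cycles = []
    for u, v in edges:
        if frozenset((u, v)) in tree:
            continue
        pu, pv = path_to_root(u), path_to_root(v)
        pu_set = set(pu)
        i = 0
        while pv[i] not in pu_set:  # climb to the lowest common ancestor
            i += 1
        cycles.append(pu[:pu.index(pv[i]) + 1] + pv[:i][::-1])
    return cycles
```

For a single 4-node ring this yields the one square cycle; for the complete graph K4 it yields the expected 6 - 4 + 1 = 3 independent triangles. Which basis is chosen matters for the region graph (the paper aims to minimize dual loops); this construction only illustrates the counting.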