8,529 research outputs found
Network Community Detection on Metric Space
Community detection in complex networks is an important problem that has attracted much interest in recent years. In general, a community detection algorithm chooses an objective function whose optimization captures the communities of the network, and various heuristics are then used to solve the optimization problem and extract the communities of interest. In this article, we demonstrate a procedure to transform a graph into points of a metric space and develop community detection methods based on a metric defined on pairs of points. We have also studied and analyzed the
community structure of the network therein. The results obtained with our approach are competitive with those of well-known algorithms from the literature, as validated on a large collection of datasets. Moreover, our algorithm runs considerably faster than competing methods, which is consistent with our theoretical findings.
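The graph-to-metric-space idea can be illustrated with a minimal sketch. Here the metric and the clustering step are assumptions for illustration (Jaccard distance between closed neighborhoods, single-linkage thresholding); the paper's actual embedding and metric may differ:

```python
# Illustrative sketch (not the paper's algorithm): map nodes to points,
# use the Jaccard distance between closed neighborhoods as the metric,
# and form communities by single-linkage thresholding on that metric.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
nodes = sorted({v for e in edges for v in e})
adj = {v: {v} for v in nodes}          # closed neighborhoods
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def jaccard_dist(a, b):
    """Distance between two nodes viewed as points in a metric space."""
    return 1.0 - len(adj[a] & adj[b]) / len(adj[a] | adj[b])

def communities(threshold=0.6):
    """Union-find single linkage: merge nodes closer than threshold."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for a in nodes:
        for b in nodes:
            if a < b and jaccard_dist(a, b) < threshold:
                parent[find(a)] = find(b)
    groups = {}
    for v in nodes:
        groups.setdefault(find(v), set()).add(v)
    return sorted(sorted(g) for g in groups.values())

print(communities())                   # → [[0, 1, 2], [3, 4, 5]]
```

On this toy graph (two triangles joined by one bridge edge), thresholding the metric recovers the two triangles as communities.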
Sketching for Large-Scale Learning of Mixture Models
Learning parameters from voluminous data can be prohibitive in terms of
memory and computational requirements. We propose a "compressive learning"
framework where we estimate model parameters from a sketch of the training
data. This sketch is a collection of generalized moments of the underlying
probability distribution of the data. It can be computed in a single pass on
the training set, and is easily computable on streams or distributed datasets.
The proposed framework shares similarities with compressive sensing, which aims
at drastically reducing the dimension of high-dimensional signals while
preserving the ability to reconstruct them. To perform the estimation task, we
derive an iterative algorithm analogous to sparse reconstruction algorithms in
the context of linear inverse problems. We exemplify our framework with the
compressive estimation of a Gaussian Mixture Model (GMM), providing heuristics
on the choice of the sketching procedure and theoretical guarantees of
reconstruction. We experimentally show on synthetic data that the proposed
algorithm yields results comparable to the classical Expectation-Maximization
(EM) technique while requiring significantly less memory and fewer computations
when the number of database elements is large. We further demonstrate the
potential of the approach on real large-scale data (over 10^8 training samples)
for the task of model-based speaker verification. Finally, we draw some
connections between the proposed framework and approximate Hilbert space
embedding of probability distributions using random features. We show that the
proposed sketching operator can be seen as an innovative method to design
translation-invariant kernels adapted to the analysis of GMMs. We also use this
theoretical framework to derive information preservation guarantees, in the
spirit of infinite-dimensional compressive sensing.
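The sketching step can be illustrated with a toy example. The generalized moments are assumed here to be random Fourier features with plain Gaussian frequencies; the paper's heuristics for choosing the sketching procedure are not reproduced:

```python
# Toy version of the sketching step (assumed form: random Fourier
# features; the frequency distribution here is a plain Gaussian draw,
# not the tuned choice the paper's heuristics would give).
import cmath
import random

random.seed(0)
d, m = 2, 8                                   # data dim, sketch size
omegas = [[random.gauss(0, 1) for _ in range(d)] for _ in range(m)]

def sketch(stream):
    """One pass over the data: z_j = mean_i exp(i * <omega_j, x_i>)."""
    z, n = [0j] * m, 0
    for x in stream:
        n += 1
        for j, w in enumerate(omegas):
            z[j] += cmath.exp(1j * sum(wk * xk for wk, xk in zip(w, x)))
    return [zj / n for zj in z]

data = [[random.gauss(0, 1) for _ in range(d)] for _ in range(1000)]
z_full = sketch(data)
# Disjoint chunks can be sketched separately and merged by a weighted
# average -- the basis of the streaming/distributed computation claim.
z_merged = [0.4 * a + 0.6 * b
            for a, b in zip(sketch(data[:400]), sketch(data[400:]))]
```

Because the sketch is an average of per-sample features, it has fixed size regardless of the number of samples, and sketches of disjoint chunks merge exactly by weighted averaging.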
Neural Connectivity with Hidden Gaussian Graphical State-Model
Noninvasive procedures for estimating neural connectivity have come under question. Theoretical models hold that the electromagnetic field registered at external sensors is elicited by currents in neural space. Nevertheless, what we observe at the sensor space is a superposition of fields projected from the whole gray matter. This is the cause of a major pitfall of noninvasive electrophysiology methods: distorted reconstruction of neural activity and its connectivity, known as leakage. It has been shown that current methods produce incorrect connectomes. Related to this incorrect connectivity modelling, they disregard both systems theory and Bayesian information theory. We introduce a new formalism that addresses these issues, the Hidden Gaussian Graphical State-Model (HIGGS): a neural Gaussian Graphical Model (GGM) hidden by the observation equation of magneto/electroencephalographic (MEEG) signals. HIGGS
is equivalent to a frequency domain Linear State Space Model (LSSM) but with
sparse connectivity prior. The mathematical contribution here is the theory for
high-dimensional and frequency-domain HIGGS solvers. We demonstrate that HIGGS can attenuate the leakage effect in the most critical case: distortion of the EEG signal due to head volume-conduction heterogeneities. Its application in EEG is illustrated with connectivity patterns retrieved from human Steady State Visual Evoked Potentials (SSVEP). We provide, for the first time, confirmatory evidence for noninvasive procedures of neural connectivity estimation: concurrent EEG and electrocorticography (ECoG) recordings in monkeys. Open-source packages are
freely available online, to reproduce the results presented in this paper and
to analyze external MEEG databases.
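The leakage effect can be illustrated with a minimal toy example, assuming a hypothetical 2x2 mixing (lead-field) matrix rather than a realistic head model: two statistically independent hidden sources appear strongly correlated at the sensors.

```python
# Toy illustration of leakage: two independent hidden sources, observed
# through a hypothetical 2x2 lead-field L (volume conduction), look
# correlated at the sensors even though the sources are not.
import random

random.seed(1)
L = [[1.0, 0.5],
     [0.5, 1.0]]                      # assumed mixing, not a head model

n = 20000
sources = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
sensors = [(L[0][0] * a + L[0][1] * b, L[1][0] * a + L[1][1] * b)
           for a, b in sources]

def corr(pairs):
    """Pearson correlation of a list of (x, y) pairs."""
    k = len(pairs)
    mx = sum(x for x, _ in pairs) / k
    my = sum(y for _, y in pairs) / k
    cov = sum((x - mx) * (y - my) for x, y in pairs) / k
    vx = sum((x - mx) ** 2 for x, _ in pairs) / k
    vy = sum((y - my) ** 2 for _, y in pairs) / k
    return cov / (vx * vy) ** 0.5

# Sources are uncorrelated (~0.0); sensors show spurious coupling (~0.8).
print(round(corr(sources), 2), round(corr(sensors), 2))
```

Naive sensor-space connectivity would report the spurious ~0.8 coupling; attenuating such artifacts is the role of the hidden-state model described above.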
Bayesian Semiparametric Hierarchical Empirical Likelihood Spatial Models
We introduce a general hierarchical Bayesian framework that incorporates a
flexible nonparametric data model specification through the use of empirical
likelihood methodology, which we term semiparametric hierarchical empirical
likelihood (SHEL) models. Although general dependence structures can be readily
accommodated, we focus on spatial modeling, a relatively underdeveloped area in
the empirical likelihood literature. Importantly, the models we develop
naturally accommodate spatial association on irregular lattices and irregularly
spaced point-referenced data. We illustrate our proposed framework by means of
a simulation study and through three real data examples. First, we develop a
spatial Fay-Herriot model in the SHEL framework and apply it to the problem of
small area estimation in the American Community Survey. Next, we illustrate the
SHEL model in the context of areal data (on an irregular lattice) through the
North Carolina sudden infant death syndrome (SIDS) dataset. Finally, we analyze
a point-referenced dataset from the North American Breeding Bird survey that
considers dove counts for the state of Missouri. In all cases, we demonstrate
superior performance of our model, in terms of mean squared prediction error,
over standard parametric analyses. Comment: 29 pages, 3 figures.
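The empirical-likelihood building block underlying SHEL-type models can be sketched in a scalar setting. This is a standard profile empirical likelihood for a single mean constraint; the spatial and hierarchical structure developed in the paper is deliberately omitted:

```python
# Scalar empirical-likelihood sketch: put probability w_i on each
# observation, maximizing sum(log w_i) subject to sum(w_i) = 1 and
# sum(w_i * (x_i - mu)) = 0. The spatial/hierarchical structure of
# the SHEL models is not modeled here.

def el_weights(xs, mu):
    n = len(xs)
    g = [x - mu for x in xs]
    if min(g) >= 0 or max(g) <= 0:
        raise ValueError("mu must lie strictly inside the data range")
    # Profile out the Lagrange multiplier lam by bisection on the
    # monotone-decreasing score f(lam) = sum(g_i / (1 + lam * g_i)).
    lo = -1.0 / max(g) + 1e-9
    hi = -1.0 / min(g) - 1e-9
    def f(lam):
        return sum(gi / (1.0 + lam * gi) for gi in g)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    return [1.0 / (n * (1.0 + lam * gi)) for gi in g]

xs = [1.0, 2.0, 4.0, 7.0]
w = el_weights(xs, mu=3.0)   # weights sum to 1 and give mean exactly 3
```

The resulting weights tilt the empirical distribution as little as possible while enforcing the moment constraint, which is the nonparametric data-model ingredient the SHEL framework builds on.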
- …