State-Space Network Topology Identification from Partial Observations
In this work, we explore the state-space formulation of a network process to
recover, from partial observations, the underlying network topology that drives
its dynamics. To do so, we employ subspace techniques borrowed from system
identification literature and extend them to the network topology
identification problem. This approach provides a unified view of the
traditional network control theory and signal processing on graphs. In
addition, it provides theoretical guarantees for the recovery of the
topological structure of a deterministic continuous-time linear dynamical
system from input-output observations even though the input and state
interaction networks might be different. The derived mathematical analysis is
accompanied by an algorithm for identifying, from data, a network topology
that is consistent with the dynamics of the system and conforms to the prior
information about the underlying structure. The proposed algorithm relies on
alternating projections and is provably convergent. Numerical results
corroborate the theoretical findings and the applicability of the proposed
algorithm.
Comment: 13 pages, 3 appendix pages
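The alternating-projections scheme mentioned in the abstract can be illustrated on a toy structural prior. The sketch below is not the authors' algorithm; the two constraint sets (symmetric matrices, and nonnegative matrices with zero diagonal, a common adjacency-matrix prior) are illustrative stand-ins. Both sets are closed and convex, so alternating projections provably converges to a point in their intersection.

```python
import numpy as np

def proj_symmetric(A):
    # Projection onto the set of symmetric matrices.
    return 0.5 * (A + A.T)

def proj_structure(A):
    # Projection onto matrices with zero diagonal and nonnegative
    # entries, a typical structural prior for adjacency matrices.
    B = np.maximum(A, 0.0)
    np.fill_diagonal(B, 0.0)
    return B

def alternating_projections(A0, n_iter=200):
    # Alternate between the two convex sets; for closed convex sets
    # with nonempty intersection, the iterates converge to a point
    # satisfying both constraints.
    A = A0.copy()
    for _ in range(n_iter):
        A = proj_structure(proj_symmetric(A))
    return A

rng = np.random.default_rng(0)
noisy = rng.normal(size=(5, 5)) + 1.0     # an unstructured initial estimate
A_hat = alternating_projections(noisy)    # symmetric, hollow, nonnegative
```

The output satisfies both structural constraints exactly, which is the role alternating projections plays when enforcing prior information on a recovered topology.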
Graph Signal Processing: Overview, Challenges and Applications
Research in Graph Signal Processing (GSP) aims to develop tools for
processing data defined on irregular graph domains. In this paper we first
provide an overview of core ideas in GSP and their connection to conventional
digital signal processing. We then summarize recent advances in developing
basic GSP tools, including methods for sampling, filtering, and graph learning.
Next, we review progress in several application areas using GSP, including
processing and analysis of sensor network data, biological data, and
applications to image processing and machine learning. We finish by providing a
brief historical perspective to highlight how concepts recently developed in
GSP build on top of prior research in other areas.
Comment: To appear, Proceedings of the IEEE
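One of the basic GSP tools the abstract lists, graph filtering, can be sketched as a polynomial in the graph Laplacian, the GSP analogue of an LTI filter. The path graph and the filter coefficients below are illustrative choices, not taken from the paper.

```python
import numpy as np

def path_laplacian(n):
    # Combinatorial Laplacian L = D - A of an undirected path graph.
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def graph_filter(L, x, coeffs):
    # Polynomial graph filter H(L) x = sum_k c_k L^k x, applied without
    # an eigendecomposition by accumulating powers of L.
    y = np.zeros_like(x)
    Lk = np.eye(L.shape[0])
    for c in coeffs:
        y = y + c * (Lk @ x)
        Lk = Lk @ L
    return y

L = path_laplacian(6)
x = np.ones(6)                              # constant, perfectly smooth signal
y = graph_filter(L, x, coeffs=[1.0, -0.5])  # H(L) = I - 0.5 L
```

Because the constant signal lies in the nullspace of the Laplacian, this filter leaves it unchanged, mirroring how a low-pass filter passes the DC component in classical signal processing.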
Graph-based learning under perturbations via total least-squares
Graphs are pervasive across fields, unveiling complex relationships between data. Two major graph-based learning tasks are topology identification and inference of signals over graphs. Among the possible models for explaining data interdependencies, structural equation models (SEMs) accommodate a gamut of applications involving topology identification. Estimating conventional SEMs, however, requires measurements across nodes. Typical signal inference approaches, on the other hand, “blindly trust” a given nominal topology. In practice, however, signal or topology perturbations may be present in both tasks, due to model mismatch, outliers, outages, or adversarial behavior. To cope with such perturbations, this work introduces a regularized total least-squares (TLS) approach and iterative algorithms with convergence guarantees to solve both tasks. Further generalizations are also considered, relying on structured and/or weighted TLS when extra prior information on the perturbation is available. Analyses with simulated and real data corroborate the effectiveness of the novel TLS-based approaches.
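The plain TLS building block that this abstract regularizes and extends can be sketched via the SVD of the augmented matrix [A | b]: unlike ordinary least squares, perturbations are allowed in both A and b. The noiseless recovery example below is illustrative, not the paper's experiment.

```python
import numpy as np

def total_least_squares(A, b):
    # Classic TLS solution: take the right singular vector of [A | b]
    # associated with the smallest singular value, then normalize so
    # its last entry is -1. Errors are modeled in both A and b.
    n = A.shape[1]
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]                 # singular values are sorted descending
    return -v[:n] / v[n]

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 2))
x_true = np.array([1.5, -2.0])
b = A @ x_true                 # noiseless data: [x_true; -1] spans the nullspace
x_tls = total_least_squares(A, b)
```

In the noiseless case the augmented matrix is rank-deficient and TLS recovers the true coefficients exactly; under perturbations it finds the smallest joint correction to A and b that makes the system consistent.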
Online Joint Topology Identification and Signal Estimation with Inexact Proximal Online Gradient Descent
Identifying the topology that underlies a set of time series is useful for
tasks such as prediction, denoising, and data completion. Vector autoregressive
(VAR) model based topologies capture dependencies among time series, and are
often inferred from observed spatio-temporal data. When the data are affected
by noise and/or missing samples, the tasks of topology identification and
signal recovery (reconstruction) have to be performed jointly. Additional
challenges arise when i) the underlying topology is time-varying, ii) data
become available sequentially, and iii) no delay is tolerated. To overcome
these challenges, this paper proposes two online algorithms to estimate the VAR
model-based topologies. The proposed algorithms have constant complexity per
iteration, which makes them interesting for big data scenarios. They also enjoy
complementary merits in terms of complexity and performance. A performance
guarantee is derived for one of the algorithms in the form of a dynamic regret
bound. Numerical tests are also presented, showcasing the ability of the
proposed algorithms to track the time-varying topologies with missing data in
an online fashion.
Comment: 14 pages including supplementary material, 2 figures, submitted to
IEEE Transactions on Signal Processing
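The constant-cost-per-iteration idea can be illustrated with online proximal gradient descent on a sparse VAR(1) model. This is a simplified stand-in for the paper's algorithms: no missing data, a static topology, and a fixed step size; all parameter values below are illustrative.

```python
import numpy as np

def soft_threshold(A, tau):
    # Entrywise soft-thresholding: the proximal operator of tau * ||A||_1,
    # which promotes a sparse estimated topology.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def online_var_topology(X, step=0.01, lam=0.001):
    # Online proximal gradient for a sparse order-1 VAR model
    # x[t] = A x[t-1] + noise. One gradient step plus one prox step
    # per sample gives constant complexity per iteration.
    T, n = X.shape
    A = np.zeros((n, n))
    for t in range(1, T):
        x_prev, x_t = X[t - 1], X[t]
        err = A @ x_prev - x_t            # instantaneous prediction error
        grad = np.outer(err, x_prev)      # gradient of 0.5 * ||err||^2
        A = soft_threshold(A - step * grad, step * lam)
    return A

# Illustrative data from a stable, sparse ground-truth VAR matrix.
rng = np.random.default_rng(2)
n, T = 4, 3000
A_true = np.diag([0.5, -0.4, 0.3, 0.6])
X = np.zeros((T, n))
for t in range(1, T):
    X[t] = A_true @ X[t - 1] + rng.normal(size=n)
A_hat = online_var_topology(X)
```

Each sample is processed once and then discarded, which is what makes this family of updates attractive for streaming and big-data scenarios.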
Kernel-based graph learning from smooth signals: a functional viewpoint
The problem of graph learning concerns the construction of an explicit topological structure revealing the relationship between nodes representing data entities, which plays an increasingly important role in the success of many graph-based representations and algorithms in the fields of machine learning and graph signal processing. In this paper, we propose a novel graph learning framework that incorporates prior information along the node and observation sides, and in particular the covariates that help to explain the dependency structures in graph signals. To this end, we consider graph signals as functions in the reproducing kernel Hilbert space associated with a Kronecker product kernel, and integrate functional learning with smoothness-promoting graph learning to learn a graph representing the relationship between nodes. The functional learning increases the robustness of graph learning against missing and incomplete information in the graph signals. In addition, we develop a novel graph-based regularisation method which, when combined with the Kronecker product kernel, enables our model to capture both the dependency explained by the graph and the dependency due to graph signals observed under different but related circumstances, e.g. different points in time. The latter means the graph signals are free from the i.i.d. assumptions required by classical graph learning models. Experiments on both synthetic and real-world data show that our methods outperform state-of-the-art models in learning a meaningful graph topology from graph signals, in particular under heavy noise, missing values, and multiple dependencies.
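The smoothness-promoting principle behind this line of work can be illustrated with a simple Gaussian-kernel graph and the Dirichlet energy it minimizes. This baseline is not the paper's kernel-based method (no Kronecker product kernel, no functional learning); the data and bandwidth are illustrative.

```python
import numpy as np

def smoothness_graph(X, sigma=1.0):
    # Gaussian-kernel graph from node-wise observations: rows of X are
    # the signals observed at each node. Nodes with similar observations
    # receive large edge weights, so the learned graph is consistent
    # with signal smoothness.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    W = np.exp(-D / sigma ** 2)
    np.fill_diagonal(W, 0.0)
    return W

def dirichlet_energy(W, X):
    # 0.5 * sum_ij W_ij ||x_i - x_j||^2: small values mean the signals
    # vary slowly over the graph, the usual notion of smoothness.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    return 0.5 * np.sum(W * D)

rng = np.random.default_rng(3)
cluster_a = rng.normal(0.0, 0.1, size=(3, 8))   # three similar nodes
cluster_b = rng.normal(3.0, 0.1, size=(3, 8))   # three very different nodes
X = np.vstack([cluster_a, cluster_b])
W = smoothness_graph(X)
energy = dirichlet_energy(W, X)
```

The learned weights connect nodes within each cluster far more strongly than across clusters, which is exactly the behavior a smoothness prior enforces before any kernel machinery is layered on top.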