Maximum Entropy Vector Kernels for MIMO system identification
Recent contributions have framed linear system identification as a
nonparametric regularized inverse problem. Relying on ℓ2-type
regularization, which accounts for the stability and smoothness of the impulse
response to be estimated, these approaches have been shown to be competitive
w.r.t. classical parametric methods. In this paper, adopting Maximum Entropy
arguments, we derive a new penalty induced by a vector-valued
kernel; to do so we exploit the structure of the Hankel matrix, thus
controlling at the same time complexity (measured by the McMillan degree),
stability, and smoothness of the identified models. As a special case we recover
the nuclear norm penalty on the squared block Hankel matrix. In contrast with
previous literature on reweighted nuclear norm penalties, our kernel is
described by a small number of hyper-parameters, which are iteratively updated
through marginal likelihood maximization; constraining the structure of the
kernel acts as a (hyper)regularizer which helps control the effective
degrees of freedom of our estimator. To optimize the marginal likelihood we
adapt a Scaled Gradient Projection (SGP) algorithm, which proves to be
significantly cheaper computationally than other first- and second-order
off-the-shelf optimization methods. The paper also contains an extensive
comparison with many state-of-the-art methods on several Monte-Carlo studies,
which confirms the effectiveness of our procedure.
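The link between the Hankel matrix and model complexity can be checked numerically: for a linear system, the rank of the (block) Hankel matrix built from the impulse response equals the McMillan degree, and the nuclear norm is its standard convex surrogate. A minimal sketch (the first-order system, dimensions and tolerance are illustrative, not taken from the paper):

```python
import numpy as np

def hankel_matrix(g, rows):
    # (Block) Hankel matrix of an impulse response; scalar blocks in this SISO sketch.
    cols = len(g) - rows + 1
    return np.array([[g[i + j] for j in range(cols)] for i in range(rows)])

# Impulse response of a first-order stable system g_k = a^k (McMillan degree 1).
a = 0.5
g = a ** np.arange(20)

H = hankel_matrix(g, rows=10)
sigma = np.linalg.svd(H, compute_uv=False)

rank = int(np.sum(sigma > 1e-8))  # numerical rank = McMillan degree
nuclear_norm = sigma.sum()        # convex surrogate for the rank
print(rank, nuclear_norm)
```

Penalizing `nuclear_norm` (or, as in the abstract, the nuclear norm of the squared block Hankel matrix) therefore biases the estimate towards low McMillan degree.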
Hybrid Beamforming via the Kronecker Decomposition for the Millimeter-Wave Massive MIMO Systems
Despite its promising performance gain, the realization of mmWave massive
MIMO still faces several practical challenges. In particular, implementing
massive MIMO in the digital domain requires hundreds of RF chains matching the
number of antennas. Furthermore, designing these components to operate at the
mmWave frequencies is challenging and costly. These challenges motivated the recent
development of hybrid beamforming, where MIMO processing is divided for separate
implementation in the analog and digital domains, called the analog and digital
beamforming, respectively. Analog beamforming using a phased array introduces
uni-modulus constraints on the beamforming coefficients, rendering
conventional MIMO techniques unsuitable and calling for new designs. In this
paper, we present a systematic design framework for hybrid beamforming for
multi-cell multiuser massive MIMO systems over mmWave channels characterized by
sparse propagation paths. The framework relies on the decomposition of analog
beamforming vectors and path observation vectors into Kronecker products of
factors that are uni-modulus vectors. Exploiting the mixed-product property of
Kronecker products, different factors of the analog beamformer are designed for either
nulling interference paths or coherently combining data paths. Furthermore, a
channel estimation scheme is designed for enabling the proposed hybrid
beamforming. The scheme estimates the angle of arrival (AoA) of data and interference paths by
analog beam scanning and data-path gains by analog beam steering. The
performance of the channel estimation scheme is analyzed. In particular, the
AoA spectrum resulting from beam scanning, which displays the magnitude
distribution of paths over the AoA range, is derived in closed-form. It is
shown that the inter-cell interference level diminishes inversely with the
array size, the square root of the pilot sequence length, and the spatial
separation between paths.
Comment: Submitted to IEEE JSAC Special Issue on Millimeter Wave
Communications for Future Mobile Networks, minor revision.
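The two algebraic facts behind this framework can be verified in a few lines: the Kronecker product of uni-modulus vectors is again uni-modulus (so the composite beamformer remains realizable with phase shifters), and the mixed-product property factorizes inner products, so making one factor orthogonal to the corresponding factor of an interference path nulls the whole path. A small sketch (array sizes and random phases are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def unimodulus(n):
    # Uni-modulus phase vector, as realized by analog phase shifters.
    return np.exp(1j * rng.uniform(0, 2 * np.pi, n))

f1, f2 = unimodulus(4), unimodulus(8)   # factors of the analog beamformer
a1, a2 = unimodulus(4), unimodulus(8)   # factors of a path observation vector

# 1) The Kronecker product of uni-modulus factors is uni-modulus (length 32).
f = np.kron(f1, f2)
assert np.allclose(np.abs(f), 1.0)

# 2) Mixed-product property: (f1 (x) f2)^H (a1 (x) a2) = (f1^H a1)(f2^H a2),
#    so choosing f1 orthogonal to a1 nulls the entire interference path.
lhs = np.vdot(f, np.kron(a1, a2))       # np.vdot conjugates its first argument
rhs = np.vdot(f1, a1) * np.vdot(f2, a2)
assert np.isclose(lhs, rhs)
```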
Regularization and Bayesian Learning in Dynamical Systems: Past, Present and Future
Regularization and Bayesian methods for system identification have been
repopularized in recent years and have proved to be competitive w.r.t.
classical parametric approaches. In this paper we shall make an attempt to
illustrate how the use of regularization in system identification has evolved
over the years, starting from the early contributions in the Automatic
Control as well as the Econometrics and Statistics literature. In particular we
shall discuss some fundamental issues such as compound estimation problems and
exchangeability, which play an important role in regularization and Bayesian
approaches, as also illustrated in early publications in Statistics. The
historical and foundational issues will be given more emphasis (and space), at
the expense of the more recent developments, which are only briefly discussed.
The main reason for such a choice is that, while the recent literature is
readily available, and surveys have already been published on the subject, in
the author's opinion a clear link with past work had not been completely
clarified.
Comment: Plenary presentation at the IFAC SYSID 2015. Submitted to Annual
Reviews in Control.
Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint
Inspired by ideas taken from the machine learning literature, new
regularization techniques have been recently introduced in linear system
identification. In particular, all the adopted estimators solve a regularized
least squares problem, differing in the nature of the penalty term assigned to
the impulse response. Popular choices include atomic and nuclear norms (applied
to Hankel matrices) as well as norms induced by the so-called stable spline
kernels. In this paper, a comparative study of estimators based on these
different types of regularizers is reported. Our findings reveal that stable
spline kernels outperform approaches based on atomic and nuclear norms since
they suitably embed information on impulse response stability and smoothness.
This point is illustrated using the Bayesian interpretation of regularization.
We also design a new class of regularizers defined by "integral" versions of
stable spline/TC kernels. Under quite realistic experimental conditions, the
new estimators outperform classical prediction error methods also when the
latter are equipped with an oracle for model order selection.
The edge cloud: A holistic view of communication, computation and caching
The evolution of communication networks shows a clear shift of focus from
just improving the communications aspects to enabling new important services,
from Industry 4.0 to automated driving, virtual/augmented reality, Internet of
Things (IoT), and so on. This trend is evident in the roadmap planned for the
deployment of the fifth generation (5G) communication networks. This ambitious
goal requires a paradigm shift towards a vision that looks at communication,
computation and caching (3C) resources as three components of a single holistic
system. The further step is to bring these 3C resources closer to the mobile
user, at the edge of the network, to enable very low latency and high
reliability services. The scope of this chapter is to show that signal
processing techniques can play a key role in this new vision. In particular, we
motivate the joint optimization of 3C resources. Then we show how graph-based
representations can be instrumental in building effective learning methods and
devising innovative resource allocation techniques.
Comment: To appear in the book "Cooperative and Graph Signal Processing:
Principles and Applications", P. Djuric and C. Richard, Eds., Academic Press,
Elsevier, 201