8,586 research outputs found
Reproducing pairs of measurable functions and partial inner product spaces
We continue the analysis of reproducing pairs of weakly measurable functions,
which generalize continuous frames. More precisely, we examine the case where
the defining measurable functions take their values in a partial inner product
space (PIP space). Several examples, both discrete and continuous, are
presented. Comment: 20 pages, 1 figure. arXiv admin note: substantial text overlap with arXiv:1505.0418
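For orientation, here is a minimal sketch of the reproducing pair condition stated in a single Hilbert space H; the paper itself lets the functions take values in a PIP space, so this is only the simplest special case.

    Given a measure space (X, \mu) and weakly measurable \psi, \phi : X \to \mathcal{H},
    the pair (\psi, \phi) is a reproducing pair if
        \langle S_{\psi,\phi} f, g \rangle = \int_X \langle f, \psi_x \rangle \, \langle \phi_x, g \rangle \, d\mu(x),
        \qquad f, g \in \mathcal{H},
    defines a bounded operator S_{\psi,\phi} with bounded inverse.
    For \psi = \phi this reduces to the continuous frame condition, with S_{\psi,\psi} the frame operator.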
Partial inner product spaces, metric operators and generalized hermiticity
Motivated by recent developments in pseudo-hermitian quantum mechanics,
we analyze the structure of unbounded metric operators in a Hilbert space. It
turns out that such operators generate a canonical lattice of Hilbert spaces,
that is, the simplest case of a partial inner product space (PIP space). Next,
we introduce several generalizations of the notion of similarity between
operators and explore to what extent they preserve spectral properties. Then we
apply some of the previous results to operators on a particular PIP space,
namely, a scale of Hilbert spaces generated by a metric operator. Finally, we
reformulate the notion of pseudo-hermitian operators in the preceding
formalism. Comment: 23 pages, 4 figures (in LaTeX)
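As a quick reminder of the objects involved, here is a sketch of the standard notions; the paper's point is precisely the unbounded case, which this sketch glosses over.

    A metric operator is a strictly positive self-adjoint operator G on \mathcal{H}
    (bounded or not), defining the inner product \langle f, g \rangle_G = \langle G f, g \rangle.
    The Hilbert spaces associated with the powers G^{\pm 1/2} arrange into a lattice
    (scale) of Hilbert spaces around \mathcal{H}, the simplest example of a PIP space.
    An operator A is quasi-Hermitian with respect to G if A^* G = G A on a suitable domain,
    i.e. A^* = G A G^{-1}; this is the pseudo-Hermiticity condition, with G playing the
    role of the metric.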
Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding
Abstractive community detection is an important spoken language understanding
task, whose goal is to group utterances in a conversation according to whether
they can be jointly summarized by a common abstractive sentence. This paper
provides a novel approach to this task. We first introduce a neural contextual
utterance encoder featuring three types of self-attention mechanisms. We then
train it using the Siamese and triplet energy-based meta-architectures.
Experiments on the AMI corpus show that our system outperforms multiple
energy-based and non-energy-based baselines from the state of the art. Code and
data are publicly available. Comment: Update baseline
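To make the energy-based meta-architectures concrete, here is a minimal, hypothetical PyTorch sketch of the triplet variant: a toy encoder (standing in for the paper's self-attentive utterance encoder) is trained so that utterances from the same abstractive community get lower energy (distance) than utterances from different ones. All names, dimensions, and defaults are illustrative, not the paper's exact setup.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ToyUtteranceEncoder(nn.Module):
        # Placeholder for the self-attentive encoder of the paper:
        # here just a linear projection of pre-pooled utterance vectors.
        def __init__(self, dim_in=300, dim_out=128):
            super().__init__()
            self.proj = nn.Linear(dim_in, dim_out)

        def forward(self, x):                      # x: (batch, dim_in)
            return F.normalize(self.proj(x), dim=-1)

    def triplet_energy_loss(enc, anchor, positive, negative, margin=1.0):
        # Triplet meta-architecture: the energy of (anchor, positive) should be
        # lower than the energy of (anchor, negative) by at least `margin`.
        za, zp, zn = enc(anchor), enc(positive), enc(negative)
        e_pos = F.pairwise_distance(za, zp)        # same community -> low energy
        e_neg = F.pairwise_distance(za, zn)        # different community -> high energy
        return torch.clamp(margin + e_pos - e_neg, min=0.0).mean()

    enc = ToyUtteranceEncoder()
    a, p, n = (torch.randn(8, 300) for _ in range(3))
    triplet_energy_loss(enc, a, p, n).backward()

The Siamese variant is analogous, trained on pairs with a contrastive-style loss instead of triplets.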
Compositional abstraction and safety synthesis using overlapping symbolic models
In this paper, we develop a compositional approach to abstraction and safety
synthesis for a general class of discrete time nonlinear systems. Our approach
makes it possible to define a symbolic abstraction by composing a set of
symbolic subsystems that are overlapping in the sense that they can share some
common state variables. We develop compositional safety synthesis techniques
using such overlapping symbolic subsystems. Comparisons, in terms of
conservativeness and of computational complexity, between abstractions and
controllers obtained from different system decompositions are provided.
Numerical experiments show that the proposed approach for symbolic control
synthesis enables a significant complexity reduction with respect to the
centralized approach, while reducing the conservatism with respect to
compositional approaches using non-overlapping subsystems.
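Schematically, the overlapping decomposition behind such approaches can be sketched as follows; this is a rough outline of the general pattern, not the paper's exact construction.

    Concrete system:  x^+ = f(x, u),  x = (x_1, ..., x_n).
    Decomposition:    index sets I_1, ..., I_N \subseteq \{1, ..., n\} with \bigcup_i I_i = \{1, ..., n\},
                      overlapping in the sense that I_i \cap I_j may be non-empty.
    Subsystem i:      keeps the state components x_{I_i}, treats the remaining coupled
                      components as external inputs, and is abstracted by a finite
                      symbolic model S_i.
    Composition:      the composed symbolic model of S_1, ..., S_N (with shared variables
                      identified) is used for safety synthesis, and the resulting
                      controller is refined to the concrete system.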
A global model for the combustion of gas mixtures released from forest fuels
International audience
Wavelets on the Two-Sphere and Other Conic Sections
We survey the construction of the continuous wavelet transform (CWT) on the two-sphere. Then we discuss the discretization of the spherical CWT, obtaining various types of discrete frames. Finally, we give some indications on the construction of a CWT on other conic sections.
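For reference, the spherical CWT takes the following form; this is a sketch of the standard construction, with the notation chosen here.

    W_f(\rho, a) = \int_{S^2} \overline{\psi_{\rho,a}(\omega)} \, f(\omega) \, d\mu(\omega),
    \qquad \psi_{\rho,a} = R_\rho D_a \psi,
    where \rho \in SO(3) acts by rotation, D_a (a > 0) is the stereographic dilation
    obtained by lifting the dilation of the tangent plane back to S^2, and d\mu is the
    rotation-invariant measure on the sphere. Admissibility of \psi makes the transform
    invertible, and discretizing (\rho, a) yields the discrete frames mentioned above.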
Classification of General Sequences by Frame-Related Operators
This note is a survey and collection of results, as well as presenting some
original research. For Bessel sequences and frames, the analysis, synthesis and
frame operators as well as the Gram matrix are well-known, bounded operators.
We investigate these operators for arbitrary sequences, which in general lead
to possibly unbounded operators. We characterize various classes of sequences
in terms of these operators and vice versa. Finally, we classify these
sequences by operators applied to orthonormal bases.
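For concreteness, the four frame-related operators attached to a sequence \Psi = (\psi_k) in a Hilbert space \mathcal{H} are the following (a standard sketch; for non-Bessel sequences the domains matter).

    analysis:        C_\Psi f = ( \langle f, \psi_k \rangle )_k,
    synthesis:       D_\Psi (c_k) = \sum_k c_k \psi_k,
    frame operator:  S_\Psi = D_\Psi C_\Psi,   S_\Psi f = \sum_k \langle f, \psi_k \rangle \, \psi_k,
    Gram matrix:     G_\Psi = C_\Psi D_\Psi,   (G_\Psi)_{j,k} = \langle \psi_k, \psi_j \rangle.
    For a Bessel sequence all four are bounded; for an arbitrary sequence they may be
    unbounded or only densely defined, which is what the classification addresses.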
Graph Classification with 2D Convolutional Neural Networks
Graph learning is currently dominated by graph kernels, which, while
powerful, suffer from significant limitations. Convolutional Neural Networks
(CNNs) offer a very appealing alternative, but processing graphs with CNNs is
not trivial. To address this challenge, many sophisticated extensions of CNNs
have recently been introduced. In this paper, we reverse the problem: rather
than proposing yet another graph CNN model, we introduce a novel way to
represent graphs as multi-channel image-like structures that allows them to be
handled by vanilla 2D CNNs. Experiments reveal that our method is more accurate
than state-of-the-art graph kernels and graph CNNs on 4 out of 6 real-world
datasets (with and without continuous node attributes), and close elsewhere.
Our approach is also preferable to graph kernels in terms of time complexity.
Code and data are publicly available. Comment: Published at ICANN 201
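The representation idea can be sketched in a few lines of Python: compress node embeddings of a graph into a stack of 2D histograms, one channel per pair of embedding dimensions, yielding an image-like tensor that a vanilla 2D CNN can consume. The function and parameter names below are hypothetical, not the paper's exact pipeline.

    import numpy as np

    def graph_to_image(node_embeddings, channels=((0, 1), (2, 3)), bins=28, lims=(-1.0, 1.0)):
        # node_embeddings: (n_nodes, d) array, e.g. from node2vec (assumed given).
        # Each channel is a normalized 2D histogram over one pair of embedding dimensions.
        img = np.zeros((len(channels), bins, bins), dtype=np.float32)
        edges = np.linspace(lims[0], lims[1], bins + 1)
        for c, (i, j) in enumerate(channels):
            hist, _, _ = np.histogram2d(node_embeddings[:, i], node_embeddings[:, j],
                                        bins=(edges, edges))
            img[c] = hist / max(hist.sum(), 1.0)   # normalize counts per channel
        return img                                 # (n_channels, bins, bins), CNN-ready

    # toy usage with random embeddings for a 50-node graph
    emb = np.random.uniform(-1, 1, size=(50, 4))
    print(graph_to_image(emb).shape)               # (2, 28, 28)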
Frames and Semi-Frames
Loosely speaking, a semi-frame is a generalized frame for which one of the
frame bounds is absent. More precisely, given a total sequence in a Hilbert
space, we speak of an upper (resp. lower) semi-frame if only the upper (resp.
lower) frame bound is valid. Equivalently, for an upper semi-frame, the frame
operator is bounded, but has an unbounded inverse, whereas a lower semi-frame
has an unbounded frame operator, with bounded inverse. We study mostly upper
semi-frames, both in the continuous case and in the discrete case, and give
some remarks for the dual situation. In particular, we show that reconstruction
is still possible in certain cases. Comment: 25 pages
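In symbols, for a total sequence (\psi_k) in a Hilbert space \mathcal{H} (the continuous case replaces sums by integrals over a measure space):

    upper semi-frame:  \sum_k |\langle f, \psi_k \rangle|^2 \le M \|f\|^2  for all f \in \mathcal{H},
                       with 0 < M < \infty but no positive lower frame bound;
    lower semi-frame:  m \|f\|^2 \le \sum_k |\langle f, \psi_k \rangle|^2  for all f in its domain,
                       with m > 0 but no finite upper frame bound.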