Perturbations of Functional Inequalities for Lévy Type Dirichlet Forms
Perturbations of super Poincaré and weak Poincaré inequalities for Lévy
type Dirichlet forms are studied. When the range of jumps is finite, our results
are natural extensions of the corresponding ones derived earlier for diffusion
processes; we show that the situation with infinite range of jumps is
essentially different. Some examples are presented to illustrate the
optimality of our results.
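For reference, the two inequalities named in the abstract have standard forms. The notation below (Dirichlet form ℰ with symmetric measure μ, rate functions α, β) is an assumption for illustration, not taken from the abstract:

```latex
% Super Poincaré inequality: for all r > 0 and suitable f,
\mu(f^2) \le r\,\mathcal{E}(f,f) + \beta(r)\,\mu(|f|)^2,
% with a decreasing rate function \beta : (0,\infty) \to (0,\infty).

% Weak Poincaré inequality: for all r > 0 and bounded f,
\mu(f^2) - \mu(f)^2 \le \alpha(r)\,\mathcal{E}(f,f) + r\,\|f\|_\infty^2,
% with a decreasing rate function \alpha : (0,\infty) \to (0,\infty).
```

The rate functions β and α quantify how far the form is from satisfying a plain Poincaré inequality, which is what perturbation results of this kind track.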
XL-NBT: A Cross-lingual Neural Belief Tracking Framework
Task-oriented dialog systems are becoming pervasive, and many companies
heavily rely on them to complement human agents for customer service in call
centers. With globalization, the need for providing cross-lingual customer
support becomes more urgent than ever. However, cross-lingual support poses
great challenges: it requires a large amount of additional annotated data from
native speakers. In order to bypass the expensive human annotation and achieve
the first step towards the ultimate goal of building a universal dialog system,
we set out to build a cross-lingual state tracking framework. Specifically, we
assume that there exists a source language with dialog belief tracking
annotations while the target languages have no annotated dialog data of any
form. Then, we pre-train a state tracker for the source language as a teacher,
which is able to exploit easy-to-access parallel data. We then distill and
transfer its knowledge to student state trackers in the target languages. We
specifically discuss two types of common parallel resources: bilingual corpus
and bilingual dictionary, and design different transfer learning strategies
accordingly. Experimentally, we successfully use the English state tracker as
the teacher to transfer its knowledge to both Italian and German trackers,
achieving promising results.
Comment: 13 pages, 5 figures, 3 tables, accepted to EMNLP 2018 conference
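The teacher-student transfer described above is a form of knowledge distillation: the student tracker is trained to match the teacher's soft distribution over slot values. A minimal sketch of such a distillation loss, assuming temperature-scaled softmax targets (the function names and temperature value are illustrative, not from the paper):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over slot-value logits."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over belief-state distributions.

    The student tracker (target language) is pushed toward the soft
    slot-value distribution produced by the teacher tracker (source
    language) on aligned parallel utterances.
    """
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```

The loss is zero when the student reproduces the teacher's distribution exactly and grows as the two distributions diverge; a higher temperature softens the targets so more of the teacher's ranking information is transferred.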
LSTD: A Low-Shot Transfer Detector for Object Detection
Recent advances in object detection are mainly driven by deep learning with
large-scale detection benchmarks. However, the fully-annotated training set is
often limited for a target detection task, which may deteriorate the
performance of deep detectors. To address this challenge, we propose a novel
low-shot transfer detector (LSTD) in this paper, where we leverage rich
source-domain knowledge to construct an effective target-domain detector with
very few training examples. The main contributions are described as follows.
First, we design a flexible deep architecture of LSTD to alleviate transfer
difficulties in low-shot detection. This architecture can integrate the
advantages of both SSD and Faster RCNN in a unified deep framework. Second, we
introduce a novel regularized transfer learning framework for low-shot
detection, where transfer-knowledge (TK) and background-depression (BD)
regularizations leverage object knowledge from the source and target domains
respectively, further enhancing fine-tuning with a few
target images. Finally, we examine our LSTD on a number of challenging low-shot
detection experiments, where LSTD outperforms other state-of-the-art
approaches. The results demonstrate that LSTD is a preferable deep detector for
low-shot scenarios.
Comment: Accepted by AAAI201
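The regularized fine-tuning objective sketched in the abstract combines a detection loss with the two penalties. A minimal illustration of how such terms could be shaped, assuming a simple formulation (the weights, function names, and exact penalty forms here are assumptions, not the paper's definitions):

```python
import numpy as np

def bd_regularizer(feature_map, object_mask):
    """Background-depression (BD) penalty: suppress feature
    activations outside ground-truth object regions of the
    few target-domain training images."""
    background = feature_map * (1.0 - object_mask)
    return float(np.sum(background ** 2))

def tk_regularizer(source_probs, target_probs, eps=1e-12):
    """Transfer-knowledge (TK) penalty: cross-entropy between the
    source detector's soft object labels and the target detector's
    predictions, carrying source-domain category knowledge."""
    p = np.asarray(source_probs, dtype=float)
    q = np.asarray(target_probs, dtype=float)
    return float(-np.sum(p * np.log(q + eps)))

def total_loss(det_loss, feature_map, object_mask,
               source_probs, target_probs,
               lam_bd=0.5, lam_tk=0.5):
    """Regularized fine-tuning objective (hypothetical weights)."""
    return (det_loss
            + lam_bd * bd_regularizer(feature_map, object_mask)
            + lam_tk * tk_regularizer(source_probs, target_probs))
```

Under this reading, BD pushes the detector to ignore background clutter in the scarce target images, while TK keeps the fine-tuned detector close to the knowledge already learned on the large source-domain benchmark.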