Reducing Sensitivity on Speaker Names for Text Generation from Dialogues
Changing speaker names consistently throughout a dialogue should not affect
its meaning and corresponding outputs for text generation from dialogues.
However, pre-trained language models, which serve as the backbone for
dialogue-processing tasks, have been shown to be sensitive to such nuances. This may
result in unfairness in real-world applications. No comprehensive analysis of
this problem has been done in the past. In this work, we propose to
quantitatively measure a model's sensitivity on speaker names, and
comprehensively evaluate a number of known methods for reducing speaker name
sensitivity, including a novel approach of our own. Extensive experiments on
multiple datasets provide a benchmark for this problem and show the favorable
performance of our approach in sensitivity reduction and quality of generation.
Comment: Findings of ACL'2
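The abstract does not spell out how sensitivity is quantified. One plausible formulation, sketched below purely as an illustration, treats sensitivity as the fraction of consistent speaker-name substitutions that change the model's output; the helper names `rename_speakers` and `sensitivity` are hypothetical, not from the paper.

```python
def rename_speakers(dialogue, mapping):
    """Consistently rename speakers in a list of (speaker, utterance) turns.

    Both the speaker field and in-utterance mentions of the old names
    are replaced, so the dialogue's meaning is preserved.
    """
    renamed = []
    for speaker, utt in dialogue:
        for old, new in mapping.items():
            utt = utt.replace(old, new)
        renamed.append((mapping.get(speaker, speaker), utt))
    return renamed

def sensitivity(generate, dialogue, name_maps):
    """Fraction of name substitutions that change the generated output.

    0.0 means the generator is fully invariant to speaker names;
    1.0 means every renaming changed its output.
    """
    base = generate(dialogue)
    changed = sum(
        1 for mapping in name_maps
        if generate(rename_speakers(dialogue, mapping)) != base
    )
    return changed / len(name_maps)
```

For example, a toy generator that copies the first speaker's name into its output scores 1.0 under a renaming, while a generator that ignores names entirely scores 0.0.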
Interplay between Quantum Size Effect and Strain Effect on Growth of Nanoscale Metal Thin Film
We develop a theoretical framework to investigate the interplay between
quantum size effect (QSE) and strain effect on the stability of metal
nanofilms. The QSE and strain effect are shown to be coupled through the
concept of "quantum electronic stress". First-principles calculations reveal
large quantum oscillations in the surface stress of metal nanofilms as a
function of film thickness. This introduces additional strain-coupled
quantum oscillations into the surface energy of strained metal nanofilms. Our theory
enables a quantitative estimation of the amount of strain in experimental
samples, and suggests that strain is an important factor contributing to the
discrepancies between existing theories and experiments.
In-sample Curriculum Learning by Sequence Completion for Natural Language Generation
Curriculum learning has shown promising improvements in multiple domains by
training machine learning models from easy samples to hard ones. Previous works,
which either design rules or train models to score sample difficulty, rely
heavily on task-specific expertise and do not generalize well. Inspired by the
``easy-to-hard'' intuition, we propose to do in-sample curriculum learning for
natural language generation tasks. Our learning strategy starts training the
model to generate the last few words, i.e., do sequence completion, and
gradually extends to generate the whole output sequence. Comprehensive
experiments show that it generalizes well to different tasks and achieves
significant improvements over strong baselines.
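The described strategy, first supervising only the last few tokens of the target and gradually extending supervision to the whole sequence, can be sketched as loss masking over the target tokens. This is only an illustrative sketch: it assumes the common convention of marking ignored target positions with -100 (the ignore index of typical cross-entropy implementations), and the helper names are hypothetical.

```python
IGNORE_INDEX = -100  # target positions with this value contribute no loss

def completion_targets(tokens, k):
    """Supervise only the last k tokens; mask the rest of the sequence."""
    cut = max(len(tokens) - k, 0)
    return [IGNORE_INDEX] * cut + tokens[cut:]

def curriculum_schedule(tokens, stages):
    """Per-stage targets that extend the supervised suffix from a short
    completion up to the full output sequence."""
    n = len(tokens)
    return [
        completion_targets(tokens, max(1, round(n * s / stages)))
        for s in range(1, stages + 1)
    ]
```

At the first stage the model only learns to complete the tail of each output; by the final stage the targets cover the entire sequence, matching ordinary training.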
Electric Field Effect in Multilayer Cr2Ge2Te6: a Ferromagnetic Two-Dimensional Material
The emergence of two-dimensional (2D) materials has attracted a great deal of
attention due to their fascinating physical properties and potential
applications for future nanoelectronic devices. Since the first isolation of
graphene, a Dirac material, a large family of new functional 2D materials have
been discovered and characterized, including insulating 2D boron nitride,
semiconducting 2D transition metal dichalcogenides and black phosphorus, and
superconducting 2D bismuth strontium calcium copper oxide, molybdenum
disulphide and niobium selenide, etc. Here, we report the identification of
ferromagnetic thin flakes of Cr2Ge2Te6 (CGT) with thickness down to a few
nanometers, which adds an important missing piece to van der Waals
structures built from various 2D materials. We further demonstrate the giant
modulation of the channel resistance of 2D CGT devices via electric field
effect. Our results illustrate the gate voltage tunability of 2D CGT and the
potential of CGT, a ferromagnetic 2D material, as a new functional quantum
material for applications in future nanoelectronics and spintronics.
Comment: To appear in 2D Material