Large Margin Neural Language Model
We propose a large margin criterion for training neural language models.
Conventionally, neural language models are trained by minimizing perplexity
(PPL) on grammatical sentences. However, we demonstrate that PPL may not be the
best metric to optimize in some tasks, and further propose a large margin
formulation. The proposed method aims to enlarge the margin between the "good"
and "bad" sentences in a task-specific sense. It is trained end-to-end and can
be widely applied to tasks that involve re-scoring of generated text. Compared
with minimum-PPL training, our method yields up to a 1.1-point reduction in word
error rate (WER) for speech recognition and a 1.0-point BLEU increase for
machine translation.
Comment: 9 pages. Accepted as a long paper at EMNLP 2018.
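As a rough illustration of such a margin criterion (a sketch under assumed details, not the authors' exact formulation), a pairwise hinge loss over language-model scores might look like the following; the model interface, the `lm_logprob` helper, and the margin value of 1.0 are all illustrative assumptions:

```python
# Minimal sketch of a pairwise large-margin objective for LM re-scoring,
# assuming PyTorch. The model interface, `lm_logprob`, and the margin
# value are illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn.functional as F

def lm_logprob(model, tokens):
    """Total log-probability of a token sequence under a causal LM.

    Assumes `model` maps input ids of shape (1, T) to logits of shape
    (1, T, vocab); `tokens` is a 1-D LongTensor of token ids.
    """
    inp = tokens[:-1].unsqueeze(0)   # context for each position
    tgt = tokens[1:].unsqueeze(0)    # next-token targets
    logp = F.log_softmax(model(inp), dim=-1)
    return logp.gather(-1, tgt.unsqueeze(-1)).sum()

def large_margin_loss(model, good, bad, margin=1.0):
    """Hinge loss: the task-preferred ("good") sentence should out-score
    the dispreferred ("bad") one by at least `margin` in log-probability."""
    gap = lm_logprob(model, good) - lm_logprob(model, bad)
    return torch.clamp(margin - gap, min=0.0)
```

Minimizing this loss widens the score gap between preferred and dispreferred hypotheses rather than minimizing perplexity directly, so that at re-scoring time the "good" candidates rank above the "bad" ones.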
Non-Commutativity, Teleology and GRB Time Delay
We propose a model in which an energy-dependent photon time delay originates
from space-time non-commutativity: the delay arises from a noncommutative
coupling between the dilaton and the photon. We predict that, in our model,
high-energy photons with different momenta can be either delayed or
superluminal; this may be related to a possible time delay reported by the
Fermi LAT and Fermi GBM Collaborations.
Comment: 8 pages, 1 figure; typo revised, contents and reference added.
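For orientation only (the paper's actual noncommutative dilaton-photon coupling is not reproduced here), models of this kind are often summarized by a leading-order arrival-time delay linear in photon energy, with a sign permitting either delayed or superluminal propagation; the scale $E_{\mathrm{NC}}$ below is an assumed placeholder:

```latex
\Delta t \;\approx\; s \,\frac{E}{E_{\mathrm{NC}}}\,\frac{D}{c},
\qquad s = \pm 1,
```

where $E$ is the photon energy, $D$ the propagation distance, and $s$ the sign selecting delayed ($s = +1$) versus superluminal ($s = -1$) arrival.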