Cricket and the Law
Book review of Cricket and the Law by David Fraser, published by The Institute of Criminology (Sydney), 1993 (273 pp.).
Just Add Functions: A Neural-Symbolic Language Model
Neural network language models (NNLMs) have achieved ever-improving accuracy
due to more sophisticated architectures and increasing amounts of training
data. However, the inductive bias of these models (formed by the distributional
hypothesis of language), while ideally suited to modeling most running text,
results in key limitations for today's models. In particular, the models often
struggle to learn certain spatial, temporal, or quantitative relationships,
which are commonplace in text and are second nature for human readers. Yet, in
many cases, these relationships can be encoded with simple mathematical or
logical expressions. How can we augment today's neural models with such
encodings?
In this paper, we propose a general methodology to enhance the inductive bias
of NNLMs by incorporating simple functions into a neural architecture to form a
hierarchical neural-symbolic language model (NSLM). These functions explicitly
encode symbolic deterministic relationships to form probability distributions
over words. We explore the effectiveness of this approach on numbers and
geographic locations, and show that NSLMs significantly reduce perplexity in
small-corpus language modeling, and that the performance improvement persists
for rare tokens even on much larger corpora. The approach is simple and
general, and we discuss how it can be applied to other word classes beyond
numbers and geography.
Comment: Preprint of paper accepted for AAAI-202
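The hierarchical idea in the abstract, where a neural model scores coarse word classes and a simple symbolic function supplies the within-class distribution, can be illustrated with a toy sketch. This is not the paper's actual NSLM: the names (`neural_class_probs`, `year_prob`, `token_prob`) and the uniform-year rule are illustrative assumptions.

```python
def neural_class_probs(context):
    # Stand-in for an NNLM's softmax over coarse classes
    # (a real model would condition on the context).
    return {"WORD": 0.7, "NUMBER": 0.3}

def year_prob(token):
    # Symbolic within-class model: a deterministic rule assigning
    # uniform probability over four-digit years 1000-2999.
    if token.isdigit() and len(token) == 4 and 1000 <= int(token) < 3000:
        return 1.0 / 2000
    return 0.0

def token_prob(token, context):
    # Hierarchical factorization: P(token) = P(class) * P(token | class).
    p = neural_class_probs(context)
    if token.isdigit():
        return p["NUMBER"] * year_prob(token)
    return p["WORD"] * 0.001  # placeholder within-class word probability

print(token_prob("1993", "published in"))
```

The point of the sketch is that the number distribution is encoded by a simple function rather than learned, which is what gives the model its stronger inductive bias on rare numeric tokens.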
CascadER: Cross-Modal Cascading for Knowledge Graph Link Prediction
Knowledge graph (KG) link prediction is a fundamental task in artificial
intelligence, with applications in natural language processing, information
retrieval, and biomedicine. Recently, promising results have been achieved by
leveraging cross-modal information in KGs, using ensembles that combine
knowledge graph embeddings (KGEs) and contextual language models (LMs).
However, existing ensembles are either (1) not consistently effective in terms
of ranking accuracy gains or (2) impractically inefficient on larger datasets
due to the combinatorial explosion problem of pairwise ranking with deep
language models. In this paper, we propose CascadER, a novel tiered ranking
architecture that maintains the ranking accuracy of full ensembling while improving
efficiency considerably. CascadER uses LMs to rerank the outputs of more
efficient base KGEs, relying on an adaptive subset selection scheme aimed at
invoking the LMs minimally while maximizing accuracy gain over the KGE.
Extensive experiments demonstrate that CascadER improves MRR by up to 9 points
over KGE baselines, setting new state-of-the-art performance on four benchmarks
while improving efficiency by one or more orders of magnitude over competitive
cross-modal baselines. Our empirical analyses reveal that diversity of models
across modalities and preservation of individual models' confidence signals
help explain the effectiveness of CascadER, and suggest promising directions
for cross-modal cascaded architectures. Code and pretrained models are
available at https://github.com/tsafavi/cascader.
Comment: AKBC 202
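The tiered-ranking pattern the abstract describes, where a cheap base model ranks all candidates and an expensive model reranks only a small head of the list, can be sketched as follows. The scorers here are toy stand-ins, not CascadER's actual KGE or LM scorers.

```python
def cheap_score(candidate):
    # Toy stand-in for an efficient base KGE scorer.
    return len(candidate)

def expensive_score(candidate):
    # Toy stand-in for a costly LM reranker.
    return len(set(candidate))

def cascade_rank(candidates, k=2):
    # Tier 1: rank every candidate with the cheap model.
    ranked = sorted(candidates, key=cheap_score, reverse=True)
    head, tail = ranked[:k], ranked[k:]
    # Tier 2: invoke the expensive model only on the top-k subset,
    # so its cost stays independent of the full candidate count.
    head = sorted(head, key=expensive_score, reverse=True)
    return head + tail

print(cascade_rank(["aa", "abc", "abcd", "aabb"], k=2))
```

In CascadER itself, `k` is not fixed but chosen by an adaptive subset selection scheme; the sketch uses a constant only to keep the example minimal.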