Challenges and opportunities of context-aware information access
Ubiquitous computing environments embedding a wide range of pervasive computing technologies provide a challenging and exciting new domain for information access. Individuals working in these environments are increasingly permanently connected to rich information resources. An appealing opportunity of these environments is the potential to deliver useful information to individuals, drawn either from their previous information experiences or from external sources. This information should enrich their life experiences or make them more effective in their endeavours. Information access in ubiquitous computing environments can be made "context-aware" by exploiting the wide range of context data available describing the environment, the searcher and the information itself. Realizing such a vision of reliable, timely and appropriate identification and delivery of information poses numerous challenges. A central theme in achieving context-aware information access is the combination of information retrieval with multiple dimensions of available context data. Potential context data sources include the user's current task, inputs from environmental and biometric sensors associated with the user's current context, previous contexts, and document context, all of which can be exploited using a variety of technologies to create new and exciting possibilities for information access.
Hierarchical Contextualized Representation for Named Entity Recognition
Named entity recognition (NER) models are typically based on a Bi-directional LSTM (BiLSTM) architecture. The constraints of its sequential nature and single-input modeling prevent full utilization of global information from a larger scope, not only the entire sentence but also the entire document (dataset). In this paper, we address these two deficiencies and propose a model augmented with a hierarchical contextualized representation: a sentence-level representation and a document-level representation. At the sentence level, we take the different contributions of words in a single sentence into consideration to enhance the sentence representation learned from an independent BiLSTM via a label-embedding attention mechanism. At the document level, a key-value memory network is adopted to record document-aware information for each unique word that is sensitive to the similarity of context information. Our two-level hierarchical contextualized representations are fused with each input token embedding and the corresponding hidden state of the BiLSTM, respectively. Experimental results on three benchmark NER datasets (the CoNLL-2003 and OntoNotes 5.0 English datasets, and the CoNLL-2002 Spanish dataset) show that we establish new state-of-the-art results.
Comment: Accepted by AAAI 202
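The label-embedding attention described in the abstract can be illustrated with a minimal NumPy sketch: each token's hidden state attends over a set of learned label embeddings, and the softmax-weighted mixture of label embeddings serves as a label-aware representation. All dimensions, names, and the random toy inputs here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def label_embedding_attention(hidden, label_emb):
    """Attend each token's hidden state over label embeddings.

    hidden:    (seq_len, d)  token hidden states, e.g. from a BiLSTM
    label_emb: (n_labels, d) learned label embeddings
    Returns a (seq_len, d) label-aware representation per token.
    """
    scores = hidden @ label_emb.T                 # (seq_len, n_labels)
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over labels
    return attn @ label_emb                       # weighted mix of labels

rng = np.random.default_rng(0)
hidden = rng.normal(size=(5, 8))   # 5 tokens, hidden size 8
labels = rng.normal(size=(4, 8))   # 4 entity labels
sent_repr = label_embedding_attention(hidden, labels)
print(sent_repr.shape)  # (5, 8)
```

In the full model this representation would then be fused with the token embeddings and BiLSTM hidden states; the sketch only covers the attention step.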
BERT-Embedding and Citation Network Analysis based Query Expansion Technique for Scholarly Search
The enormous growth of research publications has made it challenging for academic search engines to return the most relevant papers for a given search query. Numerous solutions have been proposed over the years to improve the effectiveness of academic search, including query expansion and citation analysis. Query expansion techniques mitigate the mismatch between the language used in a query and in indexed documents. However, these techniques can suffer from introducing non-relevant information while expanding the original query. Recently, applying the contextualized model BERT to document retrieval has proven quite successful for query expansion. Motivated by such issues and inspired by the success of BERT, this paper proposes a novel approach called QeBERT. QeBERT exploits BERT-based embeddings and Citation Network Analysis (CNA) for query expansion to improve scholarly search. Specifically, we use context-aware BERT embeddings and CNA for query expansion in Pseudo-Relevance Feedback (PRF) fashion. Initial experimental results on the ACL dataset show that BERT embeddings can provide a valuable augmentation to query expansion and improve search relevance when combined with CNA.
Comment: 1
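The embedding-based expansion step can be sketched as follows: given a query vector and candidate term vectors, pick the candidates with the highest cosine similarity to the query. This is a toy stand-in, with hand-made 2-d vectors in place of BERT embeddings and a fixed candidate list in place of terms mined from pseudo-relevant documents and the citation network; every name and value here is an assumption for illustration.

```python
import numpy as np

def expand_query(query_vec, term_vecs, terms, k=3):
    """Return the k candidate terms whose embeddings are most
    cosine-similar to the query embedding (PRF-style expansion)."""
    sims = term_vecs @ query_vec / (
        np.linalg.norm(term_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9)
    top = np.argsort(-sims)[:k]   # indices of the k highest similarities
    return [terms[i] for i in top]

# Toy candidates with hand-made embeddings (not real BERT vectors).
terms = ["neural", "retrieval", "citation", "cooking"]
term_vecs = np.array([[0.9, 0.1], [0.8, 0.2], [0.5, 0.5], [-1.0, 0.0]])
query_vec = np.array([1.0, 0.0])

expanded = expand_query(query_vec, term_vecs, terms, k=2)
print(expanded)  # ['neural', 'retrieval']
```

In the approach described above, the citation network would additionally filter or supply candidate terms before this ranking step; the sketch covers only the embedding-similarity ranking.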