Improving Implicit Sentiment Learning via Local Sentiment Aggregation
Recent work demonstrates encouraging progress in aspect-based sentiment
classification (ABSC), but implicit aspect sentiment modeling remains an open
problem. Our preliminary study shows that
implicit aspect sentiments usually depend on adjacent aspects' sentiments,
which indicates we can extract implicit sentiment via local sentiment
dependency modeling. We formulate a local sentiment aggregation paradigm (LSA)
based on empirical sentiment patterns (SP) to address sentiment dependency
modeling. Compared to existing methods, LSA is an efficient approach that
learns the implicit sentiments in a local sentiment aggregation window, which
tackles the efficiency problem and avoids the token-node alignment problem of
syntax-based methods. Furthermore, we refine a differential weighting method
based on gradient descent that guides the construction of the sentiment
aggregation window. According to experimental results, LSA is effective for all
objective ABSC models, attaining state-of-the-art performance on three public
datasets. LSA is an adaptive paradigm that can readily be adapted to existing
models, and we release the code to offer insight into improving existing ABSC
models.
Comment: Source Code: https://github.com/yangheng95/PyABS
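The local aggregation idea above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window size of 3 and the `(left, self, right)` mixing weights are assumptions; in LSA such weights would be learned by the differential weighting method rather than fixed.

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def aggregate_local_sentiment(aspect_logits, weights):
    """Blend each aspect's sentiment logits with its neighbors' logits
    inside a window of 3 adjacent aspects (edges fall back to self)."""
    wl, wc, wr = weights  # hypothetical (left, self, right) mixing weights
    n = len(aspect_logits)
    out = []
    for i, logits in enumerate(aspect_logits):
        left = aspect_logits[i - 1] if i > 0 else logits
        right = aspect_logits[i + 1] if i < n - 1 else logits
        mixed = [wl * l + wc * c + wr * r
                 for l, c, r in zip(left, logits, right)]
        out.append(softmax(mixed))
    return out
```

With a neutral aspect between two opposite-polarity neighbors, the neutral aspect's prediction is pulled toward its local context, which is the intuition behind local sentiment dependency modeling.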
Joint Learning of Local and Global Features for Aspect-based Sentiment Classification
Aspect-based sentiment classification (ASC) aims to judge the sentiment
polarity conveyed by the given aspect term in a sentence. The sentiment
polarity is not only determined by the local context but also related to the
words far away from the given aspect term. In some cases, most recent
attention-based models cannot sufficiently distinguish which words deserve
more attention. Meanwhile, graph-based models have been introduced into ASC
to encode syntactic dependency tree information, but these models do not
fully leverage syntactic dependency trees, as they neglect to effectively
incorporate dependency relation tag information into representation learning.
In this paper, we address these problems by effectively modeling local and
global features. First, we design a local encoder containing a Gaussian mask
layer and a covariance self-attention layer. The Gaussian mask layer
adaptively adjusts the receptive field around aspect terms to deemphasize the
effects of unrelated words and focus on local information. The covariance
self-attention layer distinguishes the attention weights of different words
more distinctly. Furthermore, we propose a dual-level graph attention network
as a global encoder that fully employs dependency tag information to capture
long-distance information effectively. Our model achieves state-of-the-art
performance on both the SemEval 2014 and Twitter datasets.
Comment: under review
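The Gaussian mask idea can be sketched as position-wise weights that peak at the aspect term and decay with distance, scaling raw attention scores before normalisation. This is an illustrative sketch under assumed simplifications: the paper's layer adapts the width per aspect, whereas here `sigma` is a fixed parameter.

```python
import math

def gaussian_mask(seq_len, aspect_pos, sigma):
    """Weight for each token position, peaking at the aspect term and
    decaying with distance (a toy stand-in for the Gaussian mask layer)."""
    return [math.exp(-((i - aspect_pos) ** 2) / (2 * sigma ** 2))
            for i in range(seq_len)]

def masked_scores(attn_scores, mask):
    """Scale raw attention scores by the mask, then renormalise so the
    distribution concentrates on tokens near the aspect."""
    scaled = [s * m for s, m in zip(attn_scores, mask)]
    total = sum(scaled) or 1.0
    return [s / total for s in scaled]
```

A smaller `sigma` narrows the receptive field around the aspect; a larger one lets more distant words contribute, which is the knob an adaptive layer would learn.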
Recommended from our members
OBOME - Ontology based opinion mining in UBIPOL
Ontologies play a special role in the UBIPOL system: they help to structure the policy-related context, provide a conceptualization of the policy domain, and are used in the opinion mining process. In this work we present a system called Ontology Based Opinion Mining Engine (OBOME) for analyzing a domain-specific opinion corpus, first assisting the user with the creation of a domain ontology from the corpus and then determining the polarity of opinion on the various domain aspects. In the former step, the policy domain aspects are identified (namely, which policy category is represented by each concept). This identification is supported by the policy modelling ontology, which describes the most important policy-related classes and structures. The most informative documents are then extracted from the corpus, and the user is asked to create a set of aspects and related keywords from these documents. In the latter step, we use the corpus-specific ontology to model the domain and extract aspect-polarity associations using grammatical dependencies between words. The summarized results are then shown to the user for analysis and storage. Finally, the policy modelling ontology is updated in an offline process.
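The dependency-based aspect-polarity association step can be sketched as follows. The lexicon, the relation labels checked, and the function shape are all hypothetical stand-ins; OBOME's actual extraction works over a parser's output and the corpus-specific ontology.

```python
# Toy sentiment lexicon -- a hypothetical stand-in for a real resource.
LEXICON = {"great": 1, "slow": -1, "helpful": 1}

def aspect_polarities(dependencies, aspects):
    """Sum lexicon polarities of opinion words linked to aspect terms
    via selected grammatical relations (e.g. adjectival modifiers).

    dependencies: (head, relation, dependent) triples from a parser."""
    scores = {a: 0 for a in aspects}
    for head, rel, dep in dependencies:
        if rel in ("amod", "nsubj"):
            if head in scores and dep in LEXICON:
                scores[head] += LEXICON[dep]
            elif dep in scores and head in LEXICON:
                scores[dep] += LEXICON[head]
    return scores
```

The point is that polarity attaches to the aspect an opinion word grammatically modifies, rather than to whatever aspect happens to be nearby in the surface string.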
Syntax-aware Hybrid prompt model for Few-shot multi-modal sentiment analysis
Multimodal Sentiment Analysis (MSA) is a popular topic in natural language
processing, at both the sentence and aspect level. However, existing
approaches typically require large labeled datasets, which consume
substantial time and resources. It is therefore practical to explore few-shot
sentiment analysis across modalities. Previous works generally operate on the
textual modality using prompt-based methods of two main types: hand-crafted
prompts and learnable prompts. The existing approach to the few-shot
multimodal sentiment analysis task has used both methods, but separately. We
design a hybrid pattern that combines one or more fixed hand-crafted prompts
with learnable prompts and uses attention mechanisms to optimize the prompt
encoder. Experiments on both sentence-level and aspect-level datasets show
that our method achieves a significant performance improvement.
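The hybrid pattern can be sketched as concatenating embeddings of fixed hand-crafted prompt tokens with freshly initialised learnable vectors. The character-based `embed` helper and all parameter names are illustrative assumptions; a real model would use the backbone's token embeddings and train the learnable part.

```python
import random

def build_hybrid_prompt(fixed_tokens, n_learnable, dim, seed=0):
    """Return a prompt matrix: embeddings of hand-crafted tokens followed
    by randomly initialised learnable prompt vectors (toy sketch)."""
    rng = random.Random(seed)

    def embed(tok):
        # Hypothetical deterministic embedding: character codes, zero-padded.
        return ([float(ord(c)) / 1000 for c in tok[:dim]]
                + [0.0] * max(0, dim - len(tok)))

    fixed = [embed(t) for t in fixed_tokens]
    learnable = [[rng.uniform(-0.5, 0.5) for _ in range(dim)]
                 for _ in range(n_learnable)]
    return fixed + learnable
```

In training, only the learnable rows would receive gradient updates, while the hand-crafted rows stay fixed; an attention module over the combined matrix is what the abstract's prompt encoder would optimize.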
Dual-Attention Model for Aspect-Level Sentiment Classification
I propose a novel dual-attention model (DAM) for aspect-level sentiment
classification. Many methods have been proposed for this task, such as
support vector machines with hand-crafted features, long short-term memory
networks based on attention mechanisms, and graph neural networks based on
dependency parsing. While these methods all perform reasonably well, they
miss one important piece of syntactic information: dependency labels. Based
on this idea, this paper proposes a model that uses dependency labels in the
attention mechanism. I evaluate the proposed approach on three datasets: the
laptop and restaurant datasets from SemEval 2014, and a Twitter dataset.
Experimental results show that the dual-attention model performs well on all
three datasets.
Comment: 9 pages, 5 figures