Stock Market Prediction via Deep Learning Techniques: A Survey
Stock market prediction is a traditional yet complex problem that has been
researched across diverse areas and application domains due to its non-linear
and highly volatile nature. Existing surveys on stock market prediction often
focus on traditional machine learning methods rather than deep learning
methods. In recent years, deep learning has come to dominate many domains and
has gained considerable success and popularity in stock market prediction.
This motivates us to provide a structured and comprehensive overview of the
research on stock market prediction focusing on deep learning techniques. We present
four elaborated subtasks of stock market prediction and propose a novel
taxonomy to summarize the state-of-the-art models based on deep neural networks
from 2011 to 2022. In addition, we provide detailed statistics on the
datasets and evaluation metrics commonly used in the stock market. Finally, we
highlight some open issues and point out several future directions by sharing
some new perspectives on stock market prediction.
On the Evolution of Knowledge Graphs: A Survey and Perspective
Knowledge graphs (KGs) are structured representations of diversified
knowledge. They are widely used in various intelligent applications. In this
article, we provide a comprehensive survey on the evolution of various types of
knowledge graphs (i.e., static KGs, dynamic KGs, temporal KGs, and event KGs)
and techniques for knowledge extraction and reasoning. Furthermore, we
introduce the practical applications of different types of KGs, including a
case study in financial analysis. Finally, we propose our perspective on the
future directions of knowledge engineering, including the potential of
combining the power of knowledge graphs and large language models (LLMs), and
the evolution of knowledge extraction, reasoning, and representation.
A Novel Distributed Representation of News (DRNews) for Stock Market Predictions
In this study, a novel Distributed Representation of News (DRNews) model is
developed and applied in deep learning-based stock market predictions. With the
merit of integrating contextual information and cross-documental knowledge, the
DRNews model creates news vectors that describe both the semantic information
and potential linkages among news events through an attributed news network.
Two stock market prediction tasks, namely short-term stock movement
prediction and stock crisis early warning, are implemented in the framework of
an attention-based Long Short-Term Memory (LSTM) network. The results suggest
that DRNews substantially improves both tasks compared with five baseline news
embedding models. Further, the attention mechanism indicates that both
short-term stock trends and stock market crises are influenced by daily news,
with the former responding more strongly to information related to the stock
market per se, whilst the latter is more sensitive to news on the banking
sector and economic policies.
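The attention pooling used in such attention-based LSTM models can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the scoring vector `w` stands in for the learned attention parameters, and the hidden states would come from an LSTM over daily news embeddings.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Weight a sequence of hidden states by attention scores and
    return the pooled context vector plus the weights.

    hidden_states: (T, d) array, one hidden state per daily news item.
    w: (d,) scoring vector (stand-in for learned attention parameters).
    """
    scores = hidden_states @ w        # (T,) relevance of each day's news
    alphas = softmax(scores)          # attention weights sum to 1
    context = alphas @ hidden_states  # (d,) weighted summary for prediction
    return context, alphas

# Toy example: 4 days of news, 3-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
w = rng.normal(size=3)
context, alphas = attention_pool(H, w)
```

Inspecting `alphas` is what lets such models attribute predictions to specific days of news, as in the analysis of stock-specific versus policy-related coverage above.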
Multivariate Realized Volatility Forecasting with Graph Neural Network
Existing publications demonstrate that limit order book data are useful in
predicting short-term volatility in stock markets. Since stocks are not
independent, changes in one stock can also affect other related stocks. In
this paper, we forecast short-term realized volatility in a multivariate
setting based on limit order book data and relational data. To achieve this
goal, we introduce a Graph Transformer Network for volatility forecasting. The
model can combine limit order book features with an unlimited number of
temporal and cross-sectional relations from different sources. Through
experiments on about 500 stocks from the S&P 500 index, we find that our model
outperforms other benchmarks.
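The message-passing idea behind combining per-stock features with relational data can be sketched as follows. This is a minimal NumPy illustration, not the paper's Graph Transformer Network; the relation matrix `A` (e.g. shared sector) and the feature shapes are hypothetical.

```python
import numpy as np

def normalize_adjacency(A):
    """Row-normalize an adjacency matrix (with self-loops added) so
    each stock averages over itself and its related stocks."""
    A_hat = A + np.eye(A.shape[0])
    return A_hat / A_hat.sum(axis=1, keepdims=True)

def graph_layer(X, A, W):
    """One message-passing step: mix each stock's features with its
    neighbours', then apply a shared linear map and ReLU.

    X: (N, F) per-stock features (e.g. LOB-derived volatility signals).
    A: (N, N) binary relation matrix (e.g. same-sector links).
    W: (F, F_out) shared weight matrix.
    """
    return np.maximum(normalize_adjacency(A) @ X @ W, 0.0)

# Toy example: 3 stocks, 4 limit-order-book features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # hypothetical sector links
W = rng.normal(size=(4, 2))
out = graph_layer(X, A, W)
```

Stacking such layers, or replacing the fixed mixing with learned attention, is what lets a model propagate volatility information between related stocks.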
Robustness, Heterogeneity and Structure Capturing for Graph Representation Learning and its Application
Graph neural networks (GNNs) are potent methods for graph representation learning (GRL), which extract knowledge from complicated (graph-)structured data in various real-world scenarios. However, GRL still faces many challenges. Firstly, GNN-based node classification may deteriorate substantially by overlooking the possibility of noisy data in graph structures, as models wrongly treat the relations among nodes in the input graphs as ground truth. Secondly, nodes and edges have different types in the real world, and it is essential to capture this heterogeneity in graph representation learning. Next, relations among nodes are not restricted to pairwise relations, and it is necessary to capture such complex relations accordingly. Finally, the absence of structural encodings, such as positional information, deteriorates the performance of GNNs. This thesis proposes novel methods to address the aforementioned problems:
1. Bayesian Graph Attention Network (BGAT): Developed for situations with scarce data, this method addresses the influence of spurious edges. Incorporating Bayesian principles into the graph attention mechanism enhances robustness, leading to competitive performance against benchmarks (Chapter 3).
2. Neighbour Contrastive Heterogeneous Graph Attention Network (NC-HGAT): By enhancing a cutting-edge self-supervised heterogeneous graph neural network model (HGAT) with neighbour contrastive learning, this method addresses heterogeneity and uncertainty simultaneously. Extra attention to edge relations in heterogeneous graphs also aids in subsequent classification tasks (Chapter 4).
3. A novel ensemble learning framework is introduced for predicting stock price movements. It adeptly captures both group-level and pairwise relations, leading to notable advancements over the existing state of the art. The integration of hypergraph and graph models, coupled with the utilisation of auxiliary data via GNNs before a recurrent neural network (RNN), provides a deeper understanding of long-term dependencies between similar entities in multivariate time series analysis (Chapter 5).
4. A novel framework for graph structure learning is introduced, segmenting graphs into distinct patches. By harnessing the capabilities of transformers and integrating other position encoding techniques, this approach robustly captures intricate structural information within a graph. This results in a more comprehensive understanding of its underlying patterns (Chapter 6).
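The graph attention mechanism that methods such as BGAT and NC-HGAT build on can be sketched as follows. This is a NumPy illustration of standard graph attention coefficients, not the thesis code; all names and shapes are illustrative.

```python
import numpy as np

def softmax_masked(scores, mask):
    """Softmax over neighbours only; non-neighbours get zero weight."""
    s = np.where(mask, scores, -np.inf)
    e = np.exp(s - s.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def gat_coefficients(X, A, W, a):
    """Attention weights alpha_ij between connected nodes, following
    the standard graph attention formulation.

    X: (N, F) node features; A: (N, N) adjacency (self-loops added).
    W: (F, D) projection; a: (2*D,) attention vector.
    """
    H = X @ W                         # (N, D) projected node features
    D = H.shape[1]
    # score(i, j) = LeakyReLU(a^T [h_i || h_j]), split into two halves
    s_i = H @ a[:D]                   # contribution of source node i
    s_j = H @ a[D:]                   # contribution of target node j
    scores = s_i[:, None] + s_j[None, :]
    scores = np.where(scores > 0, scores, 0.2 * scores)  # LeakyReLU
    mask = (A + np.eye(A.shape[0])) > 0
    return softmax_masked(scores, mask)

# Toy example: a 4-node path graph with 3 input features per node.
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = rng.normal(size=(3, 2))
a = rng.normal(size=4)                # length 2*D with D = 2
alpha = gat_coefficients(X, A, W, a)
```

Because the weights are computed per edge, approaches like BGAT can place distributions over them to down-weight spurious edges, which is the robustness idea described in item 1 above.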