22,079 research outputs found
Tensor Representation in High-Frequency Financial Data for Price Change Prediction
Nowadays, with massive amounts of trade data being collected, the dynamics of financial markets pose both a challenge and an opportunity for high-frequency traders. In order to take advantage of the rapid, subtle
movement of assets in High Frequency Trading (HFT), an automatic algorithm to
analyze and detect patterns of price change based on transaction records must
be available. The multichannel, time-series representation of financial data
naturally suggests tensor-based learning algorithms. In this work, we
investigate the effectiveness of two multilinear methods for the mid-price
prediction problem against other existing methods. Experiments on a large-scale dataset containing more than 4 million limit orders show that, by utilizing tensor representations, multilinear models outperform vector-based approaches and other competing methods.
Comment: accepted in SSCI 2017, typos fixed
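The multichannel, time-series framing above can be made concrete with a small sketch. This is an illustrative reconstruction, not the paper's code: the window length, channel count, and the `to_tensor` helper are all hypothetical choices showing how limit-order records become the 3-D tensor (samples x time x channels) that multilinear learners operate on.

```python
import numpy as np

def to_tensor(series: np.ndarray, window: int) -> np.ndarray:
    """Stack overlapping windows of a (T, C) multichannel series
    into a (T - window + 1, window, C) tensor."""
    n_samples = series.shape[0] - window + 1
    return np.stack([series[i:i + window] for i in range(n_samples)])

# Toy stand-in for limit-order-book features: 100 time steps,
# 4 channels (e.g. best bid/ask prices and volumes).
lob = np.random.default_rng(0).normal(size=(100, 4))
X = to_tensor(lob, window=10)
print(X.shape)  # (91, 10, 4)
```

Each sample in `X` preserves both the temporal axis and the channel axis, which is exactly the structure a vectorized representation would flatten away.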
A deep learning integrated Lee-Carter model
In the field of mortality, the Lee–Carter approach can be considered the milestone among stochastic models for forecasting mortality rates. We could define a “Lee–Carter model family”
that embraces all developments of this model, including its first formulation (1992), which remains the
benchmark for comparing the performance of future models. In the Lee–Carter model, the kt parameter,
which describes the mortality trend over time, plays an important role in determining future mortality behavior.
The traditional ARIMA process usually used to model kt shows evident limitations in describing the future
mortality shape. In the forecasting phase, a more plausible approach is therefore needed to capture the
nonlinear shape of the projected mortality rates. We propose an alternative to the ARIMA processes
based on a deep learning technique. More precisely, in order to capture the pattern of the kt series over
time more accurately, we apply a Recurrent Neural Network with a Long Short-Term Memory architecture
and integrate it with the Lee–Carter model to improve its predictive capacity.
The proposed approach delivers significant gains in predictive accuracy and also allows avoiding the
a priori selection of time chunks. Indeed, it is common practice among academics to delete periods in
which the noise is overwhelming or the data quality is insufficient. The strength of the Long Short-Term
Memory network lies in its ability to handle this noise and adequately reproduce it in the forecasted
trend, thanks to an architecture that takes significant long-term patterns into account.
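To make the LSTM mechanics behind this idea concrete, here is a minimal numpy sketch of a single LSTM cell run over a stylized kt-like series. Everything here is illustrative: the hidden size, the random (unfitted) weights, and the synthetic downward trend are placeholder assumptions, not the paper's fitted model.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step: scalar input x, hidden/cell states of size H."""
    H = h.shape[0]
    z = W @ np.atleast_1d(x) + U @ h + b   # stacked gate pre-activations, shape (4H,)
    i = 1.0 / (1.0 + np.exp(-z[:H]))       # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))    # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*H:3*H]))  # output gate
    g = np.tanh(z[3*H:])                   # candidate cell update
    c_new = f * c + i * g                  # long-term memory carries the trend
    h_new = o * np.tanh(c_new)             # short-term output
    return h_new, c_new

rng = np.random.default_rng(1)
H = 8                                       # illustrative hidden size
W = rng.normal(size=(4 * H, 1))             # input weights (untrained placeholders)
U = rng.normal(size=(4 * H, H))             # recurrent weights
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)

kt = np.linspace(0.0, -2.0, 30)             # stylized declining mortality index
for x in kt:                                # unroll the cell over the series
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (8,)
```

The forget gate is what lets the cell state accumulate long-range structure in kt rather than reacting only to the latest observation, which is the property the abstract appeals to for handling noisy periods.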
Bayesian Deep Net GLM and GLMM
Deep feedforward neural networks (DFNNs) are a powerful tool for functional
approximation. We describe flexible versions of generalized linear and
generalized linear mixed models incorporating basis functions formed by a DFNN.
Neural networks with random effects have not been widely considered in the
literature, perhaps because of the computational challenges of
incorporating subject-specific parameters into already complex models.
Efficient computational methods for high-dimensional Bayesian inference are
developed using Gaussian variational approximation, with a parsimonious but
flexible factor parametrization of the covariance matrix. We implement natural
gradient methods for the optimization, exploiting the factor structure of the
variational covariance matrix in computation of the natural gradient. Our
flexible DFNN models and Bayesian inference approach lead to a regression and
classification method that has a high prediction accuracy, and is able to
quantify the prediction uncertainty in a principled and convenient way. We also
describe how to perform variable selection in our deep learning method. The
proposed methods are illustrated in a wide range of simulated and real-data
examples, and the results compare favourably to a state of the art flexible
regression and classification method in the statistical literature, the
Bayesian additive regression trees (BART) method. User-friendly software
packages in Matlab, R and Python implementing the proposed methods are
available at https://github.com/VBayesLab.
Comment: 35 pages, 7 figures, 10 tables
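The factor parametrization mentioned above can be sketched in a few lines. This is a hedged illustration of the general idea, not the authors' implementation: the dimension p, factor count k, and random values are arbitrary. The variational covariance is Sigma = B B^T + diag(d)^2, so only O(pk) parameters are stored instead of O(p^2), and sampling never forms Sigma explicitly.

```python
import numpy as np

rng = np.random.default_rng(2)
p, k = 50, 3                        # parameter dimension and number of factors (k << p)
mu = rng.normal(size=p)             # variational mean (placeholder values)
B = rng.normal(size=(p, k)) * 0.1   # factor loading matrix
d = np.abs(rng.normal(size=p))      # diagonal standard deviations

def sample_theta():
    """Draw theta = mu + B z + d * eps, z ~ N(0, I_k), eps ~ N(0, I_p)."""
    z = rng.normal(size=k)
    eps = rng.normal(size=p)
    return mu + B @ z + d * eps     # cost is O(pk), no p x p matrix needed

# The implied covariance (formed here only to illustrate the structure):
Sigma = B @ B.T + np.diag(d**2)
print(Sigma.shape)  # (50, 50)
```

Because Sigma is a rank-k update of a positive diagonal matrix, it is guaranteed positive definite while remaining flexible enough to capture the dominant posterior correlations.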