
    Convolutional neural networks applied to high-frequency market microstructure forecasting

    Highly sophisticated artificial neural networks have achieved unprecedented performance across a variety of complex real-world problems in recent years, driven by their ability to detect significant patterns autonomously. Modern electronic stock markets produce large volumes of data that are well suited to these algorithms. This research explores new scientific ground by designing and evaluating a convolutional neural network for predicting future financial outcomes. A visually inspired transformation process translates high-frequency market microstructure data from the London Stock Exchange into four market-event-based input channels, which are used to train six deep networks. Primary results indicate that convolutional networks perform reasonably well on this task and extract interesting microstructure patterns that are in line with previous theoretical findings. Furthermore, the work demonstrates a new approach for exploiting and analysing market microstructure behaviour using modern deep-learning techniques.
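The event-based, image-like encoding described above can be sketched roughly as follows. The channel assignment, event names, and window length here are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

# Hypothetical channel layout: one channel per market-event type
# (an assumption for illustration; the paper's scheme may differ).
EVENT_CHANNELS = {"buy_market": 0, "sell_market": 1, "bid_limit": 2, "ask_limit": 3}

def events_to_channels(events, window=8):
    """Encode the last `window` events as a (4, window) one-hot array,
    suitable as multi-channel input to a convolutional network."""
    x = np.zeros((len(EVENT_CHANNELS), window))
    for t, ev in enumerate(events[-window:]):
        x[EVENT_CHANNELS[ev], t] = 1.0
    return x

events = ["buy_market", "bid_limit", "ask_limit", "sell_market",
          "buy_market", "buy_market", "ask_limit", "bid_limit"]
x = events_to_channels(events)
print(x.shape)  # (4, 8)
```

Each column is one event, one-hot across the four channels; a CNN can then convolve along the time axis.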

    Statistical Physics and Representations in Real and Artificial Neural Networks

    This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In the first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model, able to store multiple spatial maps as continuous, finite-dimensional attractors. The phase diagram and dynamical properties of the model are analyzed. We then show how spatial representations can be dynamically decoded using an effective Ising model capturing the correlation structure in the neural data, and compare applications to data obtained from hippocampal multi-electrode recordings and by (sub)sampling our attractor model. In the second part, we focus on the problem of learning data representations in machine learning, in particular with artificial neural networks. We start by introducing data representations through some illustrations. We then analyze two important algorithms, Principal Component Analysis and Restricted Boltzmann Machines, with tools from statistical physics.
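The Hopfield-model starting point of the first lecture can be illustrated with a minimal sketch: Hebbian couplings store a few random binary patterns, and zero-temperature sign-update dynamics recalls a stored pattern from a corrupted cue. This is the plain Hopfield model, not the multi-map extension the lectures develop:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 3  # neurons, stored patterns (low load, P << N)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings, no self-interaction
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Zero-temperature synchronous dynamics s <- sign(W s)."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Corrupt pattern 0 by flipping 5 of its 64 spins, then recall
noisy = patterns[0].copy()
flip = rng.choice(N, size=5, replace=False)
noisy[flip] *= -1
recovered = recall(noisy)
print(float((recovered == patterns[0]).mean()))  # overlap fraction with pattern 0
```

At this low load the stored patterns are (with overwhelming probability) fixed points of the dynamics, which is what makes them attractors.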

    Time series prediction and forecasting using deep learning architectures

    Nature produces time series data every day and everywhere, for example, weather data, physiological and biomedical signals, and financial and business records. Predicting the future observations of a collected sequence of historical observations is called time series forecasting. Forecasts are essential, as they guide decisions in many areas of scientific, industrial and economic activity, such as meteorology, telecommunications, finance, sales and stock exchange rates. A massive amount of research has been carried out over many years to develop models that improve time series forecasting accuracy. The major aim of time series modelling is to scrupulously examine the past observations of a time series and to develop an appropriate model that elucidates the inherent behaviour and patterns of the series. The behaviour and patterns of different time series may follow different conventions and in fact require specific countermeasures for modelling. Consequently, training neural networks to predict time series from an unknown domain remains particularly challenging. Time series forecasting remains an arduous problem despite substantial improvements in machine learning approaches, partly because different time series can behave very differently. In real-world time series data, the discriminative patterns residing in the series are often distorted by random noise and affected by high-frequency perturbations. The major aim of this thesis is to contribute to the study and expansion of time series prediction and multi-step-ahead forecasting methods based on deep learning algorithms. Time series forecasting using deep learning models is still in its infancy compared to other research areas in time series forecasting. A variety of time series data has been considered in this research.
We explored several deep learning architectures on sequential data, such as Deep Belief Networks (DBNs), Stacked AutoEncoders (SAEs), Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs). Moreover, we proposed two new methods for multi-step-ahead forecasting of time series data, and we compare them with state-of-the-art methods. The research conducted in this thesis makes theoretical, methodological and empirical contributions to time series prediction and multi-step-ahead forecasting using deep learning architectures.
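As a rough illustration of multi-step-ahead forecasting, the common recursive strategy feeds a one-step model's own predictions back as inputs. The linear AR(p) fit below is a simple stand-in for a trained deep network, not the thesis's proposed methods:

```python
import numpy as np

def fit_ar(series, p=3):
    """Least-squares AR(p) coefficients from a 1-D series."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(series, coef, horizon):
    """Recursive multi-step forecast: predict one step, append, repeat."""
    hist = list(series[-len(coef):])
    out = []
    for _ in range(horizon):
        nxt = float(np.dot(coef, hist))
        out.append(nxt)
        hist = hist[1:] + [nxt]  # predictions become inputs
    return out

t = np.arange(200)
series = np.sin(0.1 * t)        # toy series the AR model can fit exactly
coef = fit_ar(series)
preds = forecast(series, coef, horizon=5)
print(len(preds))  # 5
```

The alternative "direct" strategy trains one model per horizon step instead of iterating a single one-step model; both are standard framings for multi-step-ahead forecasting.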

    Comparing Models for Time Series Analysis

    Historically, traditional methods such as the Autoregressive Integrated Moving Average (ARIMA) model have played an important role for researchers studying time series data. Recently, as advances in computer science and machine learning have gained widespread attention, researchers in time series analysis have brought new techniques to the table. In this paper, we examine the performance difference between ARIMA and a relatively recent development in the machine learning community, Long Short-Term Memory networks (LSTM). Whereas many traditional methods assume the existence of an underlying stochastic model, these algorithmic approaches make no claims about the generating process. Our primary measure of performance is how well each model forecasts out-of-sample data. We find that data with strong seasonal structure are forecast comparatively well by either method. On the other hand, without strong seasonality, there is very little information that can be extracted, and both methods tend to perform poorly in forecasting.
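The out-of-sample protocol used for comparisons of this kind can be sketched as follows: hold out the tail of the series and score forecasts by RMSE. The seasonal-naive forecaster below is a deliberately simple stand-in (an assumption for illustration), not ARIMA or LSTM:

```python
import numpy as np

def rmse(forecast, actual):
    """Root-mean-square error between forecast and held-out data."""
    return float(np.sqrt(np.mean((np.asarray(forecast) - np.asarray(actual)) ** 2)))

def seasonal_naive(train, horizon, period=12):
    """Repeat the last observed season as the forecast."""
    last_season = train[-period:]
    return [last_season[h % period] for h in range(horizon)]

t = np.arange(120)
series = 10 * np.sin(2 * np.pi * t / 12) + 0.01 * t  # strongly seasonal toy series
train, test = series[:108], series[108:]             # hold out the last year
print(round(rmse(seasonal_naive(train, 12), test), 3))  # 0.12
```

On strongly seasonal data even this baseline scores well (the residual 0.12 is just the small trend term), which is the same effect the paper reports for ARIMA and LSTM; without seasonality the held-out RMSE would be the discriminating number.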