
    Generative Adversarial Networks for Financial Trading Strategies Fine-Tuning and Combination

    Systematic trading strategies are algorithmic procedures that allocate assets with the aim of optimizing a certain performance criterion. To obtain an edge in a highly competitive environment, the analyst needs to properly fine-tune their strategy, or discover how to combine weak signals in novel, alpha-creating ways. Both aspects, namely fine-tuning and combination, have been extensively researched with several methods, but emerging techniques such as Generative Adversarial Networks can have an impact on both. Our work therefore proposes the use of Conditional Generative Adversarial Networks (cGANs) for trading strategy calibration and aggregation. To this end, we provide a full methodology on: (i) the training and selection of a cGAN for time series data; (ii) how each sample is used for strategy calibration; and (iii) how all generated samples can be used for ensemble modelling. To provide evidence that our approach is well grounded, we designed an experiment with multiple trading strategies, encompassing 579 assets. We compared cGANs with an ensemble scheme and model validation methods, both suited for time series. Our results suggest that cGANs are a suitable alternative for strategy calibration and combination, providing outperformance where traditional techniques fail to generate any alpha.
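
    To make the conditional setup concrete, the sketch below shows a minimal cGAN for return series, assuming PyTorch. The window lengths, layer sizes, and training loop are illustrative assumptions, not the paper's architecture. Each sample drawn from a trained generator can act as an alternative history for calibrating a strategy, and the collection of samples as the basis of an ensemble.

```python
# Minimal cGAN sketch for return series (assumed sizes, not the paper's setup).
import torch
import torch.nn as nn

COND_LEN, HORIZON, NOISE_DIM = 20, 5, 16  # assumed conditioning/forecast windows

class Generator(nn.Module):
    """Maps (recent return window, noise) to a simulated future return path."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(COND_LEN + NOISE_DIM, 64), nn.ReLU(),
            nn.Linear(64, HORIZON),
        )
    def forward(self, cond, z):
        return self.net(torch.cat([cond, z], dim=1))

class Discriminator(nn.Module):
    """Scores whether a future path is real or generated, given the same window."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(COND_LEN + HORIZON, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
    def forward(self, cond, x):
        return self.net(torch.cat([cond, x], dim=1))

def train_step(G, D, opt_g, opt_d, cond, real):
    """One adversarial update; both networks are conditioned on the window."""
    bce = nn.BCEWithLogitsLoss()
    ones = torch.ones(cond.size(0), 1)
    zeros = torch.zeros(cond.size(0), 1)
    fake = G(cond, torch.randn(cond.size(0), NOISE_DIM))
    # Discriminator update: separate real from generated futures.
    opt_d.zero_grad()
    loss_d = bce(D(cond, real), ones) + bce(D(cond, fake.detach()), zeros)
    loss_d.backward(); opt_d.step()
    # Generator update: try to fool the (freshly updated) discriminator.
    opt_g.zero_grad()
    loss_g = bce(D(cond, fake), ones)
    loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
cond, real = torch.randn(32, COND_LEN), torch.randn(32, HORIZON)  # stand-in batch
print(train_step(G, D, opt_g, opt_d, cond, real))
```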

    ALGA: Automatic Logic Gate Annotator for Building Financial News Events Detectors

    We present a new automatic data labelling framework called ALGA, the Automatic Logic Gate Annotator. The framework helps to create large amounts of annotated data more quickly for training domain-specific financial news event detection classifiers. ALGA implements a rules-based approach to annotating a training dataset. This method has the following advantages: 1) unlike traditional data labelling methods, it helps to filter relevant news articles from noise; 2) it allows easier transferability to other domains and better interpretability of models trained on automatically labelled data. To build the framework, we focus on U.S.-based companies that operate in the Apparel and Footwear industry. We show that event detection classifiers trained on the data generated by our framework can achieve state-of-the-art performance on the domain-specific financial event detection task. In addition, we create a domain-specific dictionary of event synonyms.
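
    The sketch below illustrates the flavour of such a rules-based "logic gate" annotator: an AND-gate requiring both a tracked company mention and an event trigger, which simultaneously filters noise and assigns event labels. The company list, trigger phrases, and label names are hypothetical stand-ins, not ALGA's actual rules.

```python
# Hypothetical AND-gate labeller: company mention AND event trigger => label.
COMPANIES = {"nike", "adidas", "under armour"}  # assumed tracked companies
EVENT_TRIGGERS = {
    "product_recall": {"recall", "recalled"},
    "store_opening": {"opens store", "new store", "flagship"},
}

def label_article(text: str) -> list[str]:
    """Return event labels for an article, or [] if it is filtered as noise."""
    lowered = text.lower()
    if not any(company in lowered for company in COMPANIES):
        return []  # noise filter: no tracked company mentioned
    return [event for event, triggers in EVENT_TRIGGERS.items()
            if any(trigger in lowered for trigger in triggers)]

print(label_article("Nike recalled a batch of running shoes on Friday."))
# -> ['product_recall']
```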

    Quantifying the digital traces of Hurricane Sandy on Flickr

    Society’s increasing interactions with technology are creating extensive “digital traces” of our collective human behavior. These new data sources are fuelling the rapid development of the new field of computational social science. To investigate user attention to the Hurricane Sandy disaster in 2012, we analyze data from Flickr, a popular website for sharing personal photographs. In this case study, we find that the number of photos taken and subsequently uploaded to Flickr with titles, descriptions or tags related to Hurricane Sandy bears a striking correlation to the atmospheric pressure in the US state of New Jersey during this period. Appropriate leverage of such information could be useful to policy makers and others charged with emergency crisis management.
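
    A minimal sketch of the kind of correlation check described above, using NumPy; the pressure and photo-count series below are synthetic stand-ins, since the original Flickr data is not reproduced here.

```python
# Toy version of the photo-count vs. atmospheric-pressure comparison.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(100)
pressure = 1010 - 30 * np.exp(-((hours - 50) ** 2) / 200)   # synthetic hPa dip
photo_counts = (1020 - pressure) * 4 + rng.poisson(5, 100)  # rises as pressure falls

r = np.corrcoef(photo_counts, pressure)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # strongly negative for this toy series
```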

    Limit Order Book Simulations: A Review

    Limit Order Books (LOBs) serve as a mechanism for buyers and sellers to interact with each other in financial markets. Modelling and simulating LOBs is often necessary for calibrating and fine-tuning the automated trading strategies developed in algorithmic trading research. The recent AI revolution and the availability of faster and cheaper compute have enabled models and simulations to grow richer and even to use modern AI techniques. In this review we examine the various kinds of LOB simulation models present in the current state of the art. We provide a classification of the models on the basis of their methodology, together with an aggregate view of the popular stylized facts used in the literature to test them. We additionally provide a focused study of the presence of price impact in these models, since it is one of the more crucial phenomena to capture in algorithmic trading. Finally, we conduct a comparative analysis of the quality of fit of these models and of how they perform when tested against empirical data.

    Comment: To be submitted to Quantitative Finance
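
    As an example of how such a stylized-fact test looks in practice, the sketch below checks simulated mid-price returns for heavy tails (positive excess kurtosis), one of the stylized facts commonly used in this literature. The mid-price series is a synthetic stand-in for a simulator's output, not data from any of the reviewed models.

```python
# Check one stylized fact, heavy-tailed returns, on stand-in simulator output.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic mid-price path with Student-t increments as a heavy-tailed stand-in.
mid_prices = 100 * np.exp(np.cumsum(rng.standard_t(df=3, size=10_000) * 1e-4))
returns = np.diff(np.log(mid_prices))

# Excess kurtosis > 0 indicates fatter tails than a Gaussian, as seen empirically.
z = (returns - returns.mean()) / returns.std()
excess_kurtosis = (z ** 4).mean() - 3
print(f"excess kurtosis: {excess_kurtosis:.2f}")
```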

    Decentralized Token Economy Theory (DeTEcT)

    This paper presents a pioneering approach to the simulation of economic activity, policy implementation, and the pricing of goods in token economies. The paper proposes a formal framework for analysing wealth distribution and simulating interactions between economic participants. Using this framework, we define a mechanism for identifying prices that achieve a desired wealth distribution according to some metric, and for assessing the stability of the economic dynamics. The motivation to study tokenomics theory is the increasing use of tokenization, specifically in financial infrastructures, where designing token economies is at the forefront. Tokenomics theory establishes a quantitative framework for wealth distribution amongst economic participants and implements an algorithmic regulatory control mechanism that reacts to changes in economic conditions.

    In our framework, we introduce the concept of a tokenomic taxonomy, in which agents in the economy are categorized into agent types, with interactions defined between them. This novel approach is motivated by having a generalized model of the macroeconomy in which controls are implemented through interactions and policies. The existence of such controls allows us to measure and readjust the wealth dynamics in the economy to suit the desired objectives.

    Comment: 24 pages, 5 figures, 8 tables
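
    The sketch below conveys the flavour of such a simulation: two hypothetical agent types interact at a policy-set price, and a wealth-distribution metric (here the Gini coefficient) is tracked so the price can be readjusted toward a target. The agent types, interaction rule, and metric are illustrative assumptions, not DeTEcT's formal definitions.

```python
# Toy agent-type interaction loop with a wealth-distribution metric.
import numpy as np

rng = np.random.default_rng(2)
wealth = {"consumers": np.full(90, 100.0), "producers": np.full(10, 100.0)}
PRICE = 2.0  # price of one unit of the good; the policy lever being tuned

def step():
    """Each consumer buys one unit from a random producer at PRICE."""
    buyers, sellers = wealth["consumers"], wealth["producers"]
    for i in range(len(buyers)):
        if buyers[i] >= PRICE:
            buyers[i] -= PRICE
            sellers[rng.integers(len(sellers))] += PRICE

def gini(x):
    """Gini coefficient of a wealth vector (0 = equal, 1 = maximally unequal)."""
    x = np.sort(x)
    n = len(x)
    return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

for _ in range(20):
    step()
all_wealth = np.concatenate(list(wealth.values()))
print(f"Gini after 20 steps: {gini(all_wealth):.3f}")  # readjust PRICE to steer this
```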