
    Market Design for the Transition to Renewable Electricity Systems

    The research carried out in this thesis aims to shed light on the role of the European electricity market design in the transition to a target electricity system that combines sustainability, affordability, and reliability. While the ongoing expansion of fluctuating renewable electricity sources challenges the established structures and market mechanisms, governments across Europe have decided to phase out certain conventional technologies like coal or nuclear power. Since traditional electricity systems rely on the flexibility provided by controllable generation capacity, other flexibility options are needed to compensate for the decommissioned conventional power plants and to support the system integration of renewables. Against this background, the dissertation extends an established large-scale agent-based electricity market model to account for the developments towards an integrated European electricity market and for the characteristics of storage technologies. In particular, the representation of cross-border effects is enhanced by integrating approaches from the fields of operations research, non-cooperative game theory, and artificial intelligence into the simulation framework. The extended model is then applied in three case studies to analyze the diffusion of different flexibility options under varying regulatory settings. These case studies cover central aspects of the European electricity market, most importantly capacity remuneration mechanisms, the interaction of the day-ahead market and congestion management, and the role of regulation for residential self-consumption. The results of the case studies confirm that, by designing the regulatory framework, policymakers and regulators can substantially affect the quantity, composition, location, and operation of technologies, both on the supply side and on the demand side. At the same time, changes and amendments to market design are frequent and will continue to be so in the years ahead. Moreover, given the increasing level of market integration in Europe, the cross-border effects of national market designs will further gain in importance. In this context, agent-based simulation models are a valuable tool for better understanding the potential long-term effects of market designs in the interconnected European electricity system and can therefore support the European energy transition.
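
To make the modeling approach concrete, here is a minimal, hypothetical Python sketch of the basic mechanism such agent-based market models simulate: a uniform-price day-ahead auction cleared by merit order. The agent names, costs, capacities, and the demand level are invented for illustration and are not taken from the thesis or its simulation framework.

```python
# Illustrative sketch of a uniform-price day-ahead auction with merit-order
# clearing, the basic mechanism inside agent-based electricity market models.
# All agents, costs, and the demand level are invented.
from dataclasses import dataclass

@dataclass
class GeneratorAgent:
    name: str
    capacity_mw: float    # offered capacity in MW
    marginal_cost: float  # bid in EUR/MWh (assumed equal to marginal cost)

def clear_day_ahead(agents, demand_mw):
    """Dispatch the cheapest offers first; the last accepted bid sets the price."""
    dispatched, price, remaining = [], 0.0, demand_mw
    for agent in sorted(agents, key=lambda a: a.marginal_cost):
        if remaining <= 0:
            break
        award = min(agent.capacity_mw, remaining)
        dispatched.append((agent.name, award))
        price = agent.marginal_cost  # uniform clearing price
        remaining -= award
    return price, dispatched

agents = [
    GeneratorAgent("wind", 300, 0.0),
    GeneratorAgent("coal", 500, 35.0),
    GeneratorAgent("gas", 400, 60.0),
]
price, schedule = clear_day_ahead(agents, demand_mw=700)
print(f"clearing price: {price} EUR/MWh, schedule: {schedule}")
```

In a full agent-based model, the bidding agents would additionally adapt their strategies over time, which is where the game-theoretic and artificial intelligence approaches mentioned in the abstract come in.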

    BNAIC 2008: Proceedings of BNAIC 2008, the Twentieth Belgian-Dutch Artificial Intelligence Conference


    Three Essays in Financial Economics

    This thesis encompasses three essays, each of which examines the role of information in a specific setting arising in financial economics. Each essay thus contributes to the literature on the role of information in financial markets and to the debate about whether financial markets are efficient. The first essay investigates return patterns around news events by analyzing the largest news dataset studied in finance so far: 4.4 million news headlines between January 1996 and December 2019 on firms listed in North America. I use a finance-specific sentiment dictionary to classify these news headlines into positive and negative ones and contrast the results of this approach with classifications based on supervised learning models trained on the market reaction to the news. These supervised learning models include the multinomial Naïve Bayes method and several rudimentary neural networks (a minimal sketch of this classification step follows the abstract). This essay contributes to the literature by showing that supervised learning models outperform the sentiment dictionary approach traditionally used in finance. Furthermore, it provides evidence that financial markets anticipate news in the weeks ahead, that the new information is quickly priced in, and that there is no drift afterward. The second essay is a follow-up on the first: it merges the same news dataset with data on trades at the New York Stock Exchange (NYSE) between January 2011 and December 2019. This seemingly trivial process is technically challenging because the NYSE Trades and Quotes (TAQ) data contains every trade with nanosecond timestamp precision, i.e., hundreds of millions of trades per trading day, and amounts to several terabytes of raw data. The resulting dataset contains 2.3 million observations and covers, at one-second frequency, the eight hours before and after news publication. I use the models from the first essay to classify the news into positive and negative, show return patterns for the two types of news, and test trading strategies that react instantly to the new information. This study contributes to the literature by providing multiple pieces of evidence that support the efficient market hypothesis at high frequencies. First, I confirm the finding of the first essay that financial markets anticipate new information and show that they react instantly and price in the new information within minutes. Second, the tested trading strategies yield surprisingly low returns. Finally, the average return and volatility patterns around all news are highly consistent with rational pricing: the elevated volatility before news expresses the uncertainty about the content of the news (markets usually know that information is coming, because firms often schedule a news release, but they do not know the content), and I show that holding stocks in the 6.5 hours (one trading day) before news yields an average excess return of 0.1%. The third essay tests multiple measures of profitability, and their trends, for their ability to predict stock returns. Because such predictors challenge the Efficient Market Hypothesis (EMH), they are called anomalies. Reviewing these anomalies is necessary because they sometimes change over time, and many disappear when a different period is analyzed (especially post-publication) or, even worse, when proper asset pricing tests are applied. I cover the period from June 1980 to December 2021 and mainly analyze six different profitability measures with respect to their level, their trend, and their level relative to the industry's mean.
This paper makes multiple contributions to the literature. First, I show that the trend-of-profitability effect described in Akbas et al. (2017) is mainly driven by the period 2000 to 2006 and has been reversing since then. This finding is also robust to slight changes in their methodology. Second, I confirm that cleaning Compustat's Selling, General and Administrative (SG&A) cost variable by re-adding Research and Development (R&D) expenses, as described in Ball et al. (2015), improves not only their profitability measure but also the one used in Fama and French (2015). Third, I show that the difference from the industry's mean yields strong results in Fama-MacBeth regressions; however, it does not translate into high value-weighted portfolio returns and therefore lags the absolute level of profitability as a predictor of future returns. Fourth, I propose a different measure of value compared to the popular book equity-to-market equity ratio, namely Gross Profit (GP) minus SG&A divided by market equity (written out below), because the former seems to have lost its predictive power while the latter has not and was also the stronger predictor before. This measure of value is an ideal complement to measures of profitability. Collectively, this thesis contributes to the debate on market efficiency and stock return predictability. The first essay finds that financial markets anticipate news in the weeks ahead, price it in quickly, and that individual news items are not medium- or long-term return predictors. The second essay shows that this pattern can also be found at the intraday level, that most of the new information is priced in immediately, and that the average drift afterward is minimal and lasts only a few minutes. The third essay contrasts the evidence of market efficiency from the first two essays with long-term return predictability based on profitability measures. While the third essay is not proof against market efficiency, because there are theoretical risk-related explanations for the excess returns of highly profitable stocks, I consider these explanations at least questionable. Furthermore, I would like to highlight that two of the risk factors of Fama and French, namely size and value, have had negative returns for over a decade. Should this persist, it raises the question of whether markets were efficient and are not anymore, or whether they were not and are now. From a behavioral perspective, we as humans may be capable of correctly assessing the impact of individual new information but ignorant of how to focus on what is truly relevant in this ever-expanding sea of information.
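
The classification step in the first essay can be illustrated with a short, hypothetical Python sketch: a multinomial Naïve Bayes model fitted on headlines, with scikit-learn standing in for whatever tooling the thesis actually used. The headlines and labels below are invented; in the essay, the labels are derived from the market reaction to each headline.

```python
# Toy version of the supervised classification step: a multinomial Naive
# Bayes classifier over bag-of-words headline features. All headlines and
# labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

headlines = [
    "Company X beats earnings expectations",
    "Company Y misses revenue forecast",
    "Regulator approves Company Z merger",
    "Company W warns of weaker quarter",
]
labels = [1, 0, 1, 0]  # 1 = positive market reaction, 0 = negative

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(headlines, labels)
print(model.predict(["Company V beats expectations"]))  # most likely [1]
```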
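
For reference, the value measure proposed in the third essay, with GP denoting gross profit, SG&A the selling, general and administrative costs, and ME market equity, can be written as

```latex
\mathrm{Value} = \frac{GP - SG\&A}{ME}
```

whereas the traditional measure it is compared against is the book equity-to-market equity ratio BE/ME.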

    Exceptional Model Mining

    Finding subsets of a dataset that somehow deviate from the norm, i.e. where something interesting is going on, is a classical Data Mining task. In traditional local pattern mining methods, such deviations are measured in terms of a relatively high occurrence (frequent itemset mining) or an unusual distribution for one designated target attribute (subgroup discovery). These, however, do not encompass all forms of "interesting". To capture a more general notion of interestingness in subsets of a dataset, we develop Exceptional Model Mining (EMM). This is a supervised local pattern mining framework in which several target attributes are selected and a model over these attributes is chosen as the target concept. Then, subsets are sought on which this model is substantially different from the model on the whole dataset. For instance, we can find parts of the data where two target attributes have an unusual correlation, where a classifier has deviating predictive performance, or where a Bayesian network fitted on several target attributes has an exceptional structure. We discuss some real-world applications of EMM instances, including using the Bayesian network model to identify meteorological conditions under which food chains are displaced, and using a regression model to find the subset of households in the Chinese province of Hunan that do not follow the general economic law of demand. This research is supported by the Netherlands Organisation for Scientific Research (NWO) under project number 612.065.822 (Exceptional Model Mining).
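
To make the correlation instance concrete, the following hypothetical Python sketch scores candidate subgroups by how far the correlation between two target attributes inside the subgroup deviates from the correlation on the whole dataset. The column names and data are invented and merely echo the law-of-demand example; a real EMM implementation would additionally search the space of subgroup descriptions and correct the quality measure for subgroup size.

```python
# Illustrative sketch of one EMM instance: score candidate subgroups by how
# much the correlation between two target attributes deviates from the
# correlation on the whole dataset. Column names and data are invented.
import pandas as pd

def correlation_deviation(df, subgroup_mask, target_a, target_b):
    """Quality measure: |corr(subgroup) - corr(whole dataset)|."""
    overall = df[target_a].corr(df[target_b])
    subgroup = df.loc[subgroup_mask, target_a].corr(df.loc[subgroup_mask, target_b])
    return abs(subgroup - overall)

df = pd.DataFrame({
    "region": ["north"] * 4 + ["south"] * 4,
    "price":  [1, 2, 3, 4, 1, 2, 3, 4],
    "demand": [8, 6, 4, 2, 3, 5, 6, 9],  # "south" violates the law of demand
})

# Candidate subgroups are descriptions over the non-target attributes.
for region in df["region"].unique():
    mask = df["region"] == region
    print(region, correlation_deviation(df, mask, "price", "demand"))
```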

    Proceedings of the 18th Irish Conference on Artificial Intelligence and Cognitive Science

    These proceedings contain the papers that were accepted for publication at AICS-2007, the 18th Annual Conference on Artificial Intelligence and Cognitive Science, which was held at the Technological University Dublin, Ireland, from 29 to 31 August 2007. AICS is the annual conference of the Artificial Intelligence Association of Ireland (AIAI).

    On the Combination of Game-Theoretic Learning and Multi Model Adaptive Filters

    This paper casts the coordination of a team of robots within the framework of game-theoretic learning algorithms. In particular, a novel variant of fictitious play is proposed that uses multi-model adaptive filters to estimate the other players' strategies. The proposed algorithm can serve as a coordination mechanism between players when they must make decisions under uncertainty. Each player chooses an action after taking into account the actions of the other players as well as the uncertainty, which can arise either from noisy observations or from the presence of various types of other players. In addition, in contrast to other game-theoretic and heuristic algorithms for distributed optimisation, it is not necessary to find the optimal parameters a priori: various parameter values can be used initially as inputs to different models, so the resulting decisions aggregate the results across all parameter values. Simulations are used to test the performance of the proposed methodology against other game-theoretic learning algorithms.
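
For context, here is a minimal Python sketch of classic fictitious play in a two-player coordination game, the baseline on which the paper builds; it is not the proposed variant, and the payoff matrix is invented. Each player best-responds to the empirical frequencies of the other player's past actions; in the paper's variant, as the abstract describes it, those beliefs would instead be estimated by a bank of multi-model adaptive filters, one model per candidate parameter value.

```python
# Minimal sketch of classic fictitious play in a two-player coordination
# game. Each player best-responds to the empirical frequencies of the other
# player's past actions. This is the textbook baseline, not the multi-model
# adaptive filter variant proposed in the paper.
import numpy as np

# Payoff matrix of a coordination game (both players share these payoffs):
# rows = own action, columns = opponent action.
PAYOFF = np.array([[2.0, 0.0],
                   [0.0, 1.0]])

def best_response(opponent_counts):
    """Best response to the empirical distribution of opponent actions."""
    beliefs = opponent_counts / opponent_counts.sum()
    return int(np.argmax(PAYOFF @ beliefs))

counts = [np.ones(2), np.ones(2)]  # counts[i]: observed actions of player i
for t in range(50):
    a0 = best_response(counts[1])  # player 0 responds to beliefs about player 1
    a1 = best_response(counts[0])  # player 1 responds to beliefs about player 0
    counts[0][a0] += 1
    counts[1][a1] += 1

print("empirical play:", counts[0] / counts[0].sum(), counts[1] / counts[1].sum())
```

In this game the process quickly settles on both players choosing the payoff-dominant action, illustrating why fictitious play is attractive as a lightweight coordination mechanism.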