2,375 research outputs found

    Information Aggregation in Exponential Family Markets

    We consider the design of prediction market mechanisms known as automated market makers. We show that we can design these mechanisms via the mold of exponential family distributions, a popular and well-studied probability distribution template used in statistics. We give a full development of this relationship and explore a range of benefits. We draw connections between the information aggregation of market prices and the belief aggregation of learning agents that rely on exponential family distributions. We develop a very natural analysis of the market behavior as well as the price equilibrium under the assumption that the traders exhibit risk aversion according to exponential utility. We also consider similar aspects under alternative models, such as when traders are budget constrained.
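
    As a point of reference, the cost-function form this construction typically takes can be written compactly; the notation below (sufficient statistic \phi, base measure \nu, natural parameter \theta) is assumed for illustration rather than quoted from the abstract:

        % Sketch of an exponential-family market maker under the standard
        % cost-function framework (notation assumed, not the paper's).
        C(\theta) = \log \int_{\mathcal{X}} \exp\big(\langle \theta, \phi(x) \rangle\big)\, d\nu(x)
            \quad \text{(cost function = log-partition function)}
        p(\theta) = \nabla C(\theta) = \mathbb{E}_{x \sim q_\theta}[\phi(x)]
            \quad \text{(market prices = expected sufficient statistics)}
        % A trader moving the market state from \theta to \theta' pays C(\theta') - C(\theta),
        % so prices coincide with the mean parameters of the family
        % q_\theta(x) \propto \exp(\langle \theta, \phi(x) \rangle).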

    Axioms for Constant Function Market Makers

    We study axiomatic foundations for different classes of constant-function automated market makers (CFMMs). We focus particularly on separability and on different invariance properties under scaling. Our main results are an axiomatic characterization of a natural generalization of constant product market makers (CPMMs), popular in decentralized finance, on the one hand, and a characterization of the Logarithmic Scoring Rule Market Makers (LMSR), popular in prediction markets, on the other hand. The first class is characterized by the combination of independence and scale invariance, whereas the second is characterized by the combination of independence and translation invariance. The two classes are therefore distinguished by a different invariance property that is motivated by different interpretations of the numéraire in the two applications. However, both are pinned down by the same separability property. Moreover, we characterize the CPMM as an extremal point within the class of scale invariant, independent, symmetric AMMs with non-concentrated liquidity provision. Our results add to a formal analysis of mechanisms that are currently used for decentralized exchanges and connect the most popular class of DeFi AMMs to the most popular class of prediction market AMMs.
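
    To make the contrast between the two invariance properties concrete, a minimal sketch under the usual definitions (notation assumed; the paper's precise axioms may be stated differently): the constant product invariant behaves predictably when all reserves are rescaled, whereas the LMSR cost function shifts by a constant when every outstanding quantity is translated by the same amount.

        % CPMM: reserves x_1, ..., x_n > 0 must keep the product constant.
        \varphi(x) = \prod_{i=1}^{n} x_i = k,
            \qquad \varphi(\lambda x) = \lambda^{n}\, \varphi(x) \quad \text{(homogeneous under scaling of reserves)}
        % LMSR: outstanding share vector q, liquidity parameter b > 0.
        C(q) = b \log \sum_{i=1}^{n} \exp(q_i / b),
            \qquad C(q + c\mathbf{1}) = C(q) + c \quad \text{(translation invariance)}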

    Generalised Mixability, Constant Regret, and Bayesian Updating

    Mixability of a loss is known to characterise when constant regret bounds are achievable in games of prediction with expert advice through the use of Vovk's aggregating algorithm. We provide a new interpretation of mixability via convex analysis that highlights the role of the Kullback-Leibler divergence in its definition. This naturally generalises to what we call \Phi-mixability, where the Bregman divergence D_\Phi replaces the KL divergence. We prove that losses that are \Phi-mixable also enjoy constant regret bounds via a generalised aggregating algorithm that is similar to mirror descent.
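
    For reference, the Bregman divergence behind \Phi-mixability has the standard form below; the paper's precise conditions on \Phi may differ, so this is only the generic definition:

        % Bregman divergence generated by a convex, differentiable \Phi.
        D_\Phi(x, y) = \Phi(x) - \Phi(y) - \langle \nabla\Phi(y),\, x - y \rangle
        % With \Phi(x) = \sum_i x_i \log x_i (negative entropy) and x, y in the
        % probability simplex, this recovers the KL divergence:
        D_\Phi(x, y) = \sum_i x_i \log (x_i / y_i) = \mathrm{KL}(x \,\|\, y)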

    Securities trading in multiple markets: the Chinese perspective

    This thesis studies the trading of Chinese American Depositary Receipts (ADRs) and their respective underlying H shares issued in Hong Kong. The primary intention of this work is to investigate the arbitrage opportunity between the Chinese ADRs and their underlying H shares, motivated by the market observation that hedge funds are often among the top 10 shareholders of these Chinese ADRs. We start our study from the origin of the Chinese ADRs, China's stock market. We pay particular attention to the ownership structure of Chinese listed firms, because some of the Chinese ADR issuers have also listed A shares (exclusively owned by Chinese citizens) in Shanghai. We also pay attention to the market microstructures and trading costs of the three China-related stock exchanges. We then proceed to an empirical study of the Chinese ADR arbitrage possibility by comparing the return distributions of the two securities; we find that the two securities differ in their return distributions, and that the difference is due to inequality in the higher moments, such as skewness and kurtosis. Based on the law of one price and weak-form market efficiency, the prices of identical securities traded in different markets should be similar, as any deviation in their prices will be arbitraged away. Given the intrinsic property of ADRs that a convenient transfer mechanism exists between the ADRs and their underlying shares, which makes arbitrage easy, the different return distributions of the ADRs and the underlying shares raise the question of whether, when arbitrage is costly, the equilibrium price achieved in each market is driven mainly by the local market where the Chinese ADRs or the underlying Hong Kong shares are traded: the demand for and supply of the stock in each market, the different market microstructures and market mechanisms which produce different trading costs in each market, and the different noise trading arising from asymmetric information across markets. Because of these trading costs, noise trading risk, and liquidity risk, the arbitrage opportunity between the two markets would not be exploited promptly. This concern leads to the second intention of this work: how noise trading and trading costs come to play a role in determining asset prices, which leads us to empirically investigate the comovement effect as well as liquidity risk. With regard to these issues, we progress along two strands. First, we test the relationship between the price differentials of the Chinese ADRs and the market returns of the US and Hong Kong markets. This test examines the comovement effect caused by asynchronous noise trading. We find that the US market impact dominates the Hong Kong market impact, though both markets display a significant impact on the ADRs' price differentials. Second, we analyse the liquidity effect on the Chinese ADRs and their underlying Hong Kong shares by using two proxies to measure illiquidity cost and liquidity risk. We find a significant positive relation between return and trading volume, which is used to capture liquidity risk. This finding leads to a deeper study of the relationship between trading volume and return volatility from a market microstructure perspective. In order to select a proper model for return volatility, we carry out tests for heteroscedasticity and then use two asymmetric GARCH models to capture the leverage effect.
    We find that the Chinese ADRs and their underlying Hong Kong shares have different patterns in the leverage effect as modelled by these two asymmetric GARCH models, and this finding explains, from another angle, why the two securities are unequal in the higher moments of their return distributions. We then test two opposing hypotheses about the volume-volatility relation. The Mixture of Distributions Hypothesis suggests a positive relation between contemporaneous volume and volatility, while the Sequential Information Arrival Hypothesis indicates a causal, lead-lag relationship between volume and volatility. We find supportive evidence for the Sequential Information Arrival Hypothesis but not for the Mixture of Distributions Hypothesis.
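
    The abstract does not name the two asymmetric GARCH specifications used. As a purely illustrative sketch of the kind of model involved, the snippet below fits a GJR-GARCH(1,1), one common asymmetric variant, with the Python arch package on a synthetic return series; the data and the specification choice are assumptions, not the thesis's exact setup.

        # Illustrative only: fit one common asymmetric GARCH model to capture a
        # leverage effect. Synthetic data stands in for ADR / H-share returns.
        import numpy as np
        from arch import arch_model

        rng = np.random.default_rng(0)
        returns = rng.standard_t(df=5, size=2000)  # placeholder daily returns, in percent

        # o=1 adds the asymmetry (leverage) term:
        # sigma_t^2 = omega + alpha*e_{t-1}^2 + gamma*e_{t-1}^2*1[e_{t-1} < 0] + beta*sigma_{t-1}^2
        model = arch_model(returns, mean="Constant", vol="GARCH", p=1, o=1, q=1, dist="t")
        result = model.fit(disp="off")
        print(result.summary())  # a positive, significant gamma[1] indicates a leverage effect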

    The structure of derivatives exchanges: lessons from developed and emerging markets

    The authors examine the architecture, elements of market design, and the products traded in derivatives exchanges around the world. The core function of a derivatives exchange is to facilitate the transfer of risk among economic agents by providing mechanisms to enhance liquidity and facilitate price discovery. They test the proposition that organizational arrangements necessary to perform this function are not the same across markets. They also examine the sequencing of products introduced in derivatives exchanges. Using a survey instrument, they find that: a) Financial systems perform the same core functions across time and place but institutional arrangements differ. b) The ownership structure of derivatives exchanges assumes different forms across markets. c) The success of an exchange depends on the structure adopted and the products traded. d) Exchanges are regulated directly or indirectly through a government law. In addition, exchanges have their own regulatory structure. e) Typically (but not always) market-making systems are based on open outcry, with daily mark-to-market and gross margining -- but electronic systems are gaining popularity. f) Several (but not all) exchanges own clearing facilities and use netting settlement procedures. As for derivative products traded, they find that: i) Although most of the older exchanges started with (mainly agricultural) commodity derivatives, newer exchanges first introduce financial derivative products. ii) Derivatives based on a domestic stock index have greater potential for success followed by derivatives based on local interest rates and currencies. iii) The introduction of derivatives contracts appears to take more time in emerging markets compared with developed markets, with the exception of index products.

    Algorithmic trading, market quality and information: a dual-process account

    One of the primary challenges encountered when conducting theoretical research on the subject of algorithmic trading is the wide array of strategies employed by practitioners. Current theoretical models treat algorithmic traders as a homogeneous trader group, resulting in a gap between theoretical discourse and empirical evidence on algorithmic trading practices. In order to address this, the current study introduces an organisational framework from which to conceptualise and synthesise the vast range of algorithmic trading strategies. More precisely, using the principles of contemporary cognitive science, it is argued that the dual process paradigm - the most prevalent contemporary interpretation of the nature and function of human decision making - lends itself well to a novel taxonomy of algorithmic trading. This taxonomy serves primarily as a heuristic to inform a theoretical market microstructure model of algorithmic trading. Accordingly, this thesis presents the first unified, all-inclusive theoretical model of algorithmic trading, the overall aim of which is to determine the evolving nature of financial market quality as a consequence of this practice. In accordance with the literature on both cognitive science and algorithmic trading, this thesis espouses that there exist two distinct types of algorithmic trader: one (System 1) having fast processing characteristics, and the other (System 2) having slower, more analytic or reflective processing characteristics. Concomitantly, the current microstructure literature suggests that a trader can be superiorly informed as a result of either (1) their superior speed in accessing or exploiting information, or (2) their superior ability to more accurately forecast future variables. To date, microstructure models focus on one aspect or the other, but not both. This common modelling assumption is also evident in theoretical models of algorithmic trading. Theoretical papers on the topic have coalesced around the idea that algorithmic traders possess a comparative advantage relative to their human counterparts. However, the literature has yet to reach consensus on what this advantage entails, or on its subsequent effects on financial market quality. Notably, the key assumptions underlying the dual-process taxonomy of algorithmic trading suggest that two distinct informational advantages underlie algorithmic trading. The possibility then follows that System 1 algorithmic traders possess an inherent speed advantage and System 2 algorithmic traders, an inherent accuracy advantage. The various strategies associated with algorithmic trading correspond to their own respective system and, by implication, informational advantage. Incorporating both types of informational advantage within a microstructure model of trade is a challenging problem. Models typically eschew this issue entirely by restricting themselves to the analysis of one type of information variable in isolation. This is done solely for the sake of tractability and simplicity (models can in theory include both variables). Thus, including both types of private information within a single microstructure model serves to enhance the novel contribution of this work. To prepare for the final theoretical model of this thesis, the present study first conjectures and verifies a benchmark model with only one type/system of algorithmic trader.
    More formally, a System 2 algorithmic trader is introduced into Kyle's (1985) static Bayesian Nash Equilibrium (BNE) model. The behavioural and informational characteristics of this agent emanate from the key assumptions reflected in the taxonomy. The final dual-process microstructure model, presented in the concluding chapter of this thesis, extends the benchmark model (which builds on Kyle (1985)) by introducing the System 1 algorithmic trader, thereby incorporating both algorithmic trader systems. As noted above, the benchmark model nests the Kyle (1985) model: in a limiting case of the benchmark model, where the System 2 algorithmic trader does not have access to this particular form of private information, the equilibrium reduces to the equilibrium of the static model of Kyle (1985). Likewise, in the final model, when the System 1 algorithmic trader's information is negligible, the model collapses to the benchmark model. Interestingly, this thesis was able to determine how the strategic interplay between two differentially informed algorithmic traders impacts market quality over time. The results indicate that the two algorithmic trading systems differ in their relative impact on financial market quality. The unique findings of this thesis are addressed in the concluding chapter. Empirical implications of the final model are also discussed.
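
    Since both the benchmark and the final model nest Kyle (1985), it may help to recall the single-period equilibrium they collapse to; these are the standard textbook expressions, with notation assumed rather than taken from the thesis:

        % Kyle (1985), single auction. Liquidation value v ~ N(p_0, \Sigma_0),
        % noise trade u ~ N(0, \sigma_u^2), insider order x, price rule p = p_0 + \lambda(x + u).
        x = \beta (v - p_0), \qquad \beta = \sqrt{\sigma_u^2 / \Sigma_0}
        \lambda = \tfrac{1}{2} \sqrt{\Sigma_0 / \sigma_u^2}
        % In equilibrium half of the insider's private information is impounded in the price:
        % posterior variance \Sigma_1 = \Sigma_0 / 2.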

    Broker identity and market quality: is it worth revealing yourself after a trade?

    This thesis studies the impact that the introduction of post-trade anonymity, which took place in 2004 in the new Borsa Italiana market, has had on market quality and in particular on liquidity.
