
    Theory Morphisms in Church's Type Theory with Quotation and Evaluation

    ${\rm CTT}_{\rm qe}$ is a version of Church's type theory with global quotation and evaluation operators that is engineered to reason about the interplay of syntax and semantics and to formalize syntax-based mathematical algorithms. ${\rm CTT}_{\rm uqe}$ is a variant of ${\rm CTT}_{\rm qe}$ that admits undefined expressions, partial functions, and multiple base types of individuals. It is better suited than ${\rm CTT}_{\rm qe}$ as a logic for building networks of theories connected by theory morphisms. This paper presents the syntax and semantics of ${\rm CTT}_{\rm uqe}$, defines a notion of a theory morphism from one ${\rm CTT}_{\rm uqe}$ theory to another, and gives two simple examples that illustrate the use of theory morphisms in ${\rm CTT}_{\rm uqe}$. Comment: 17 pages
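    The idea of a theory morphism can be illustrated with a toy sketch, which is a drastic simplification of the ${\rm CTT}_{\rm uqe}$ formalism: here a theory is just a set of axiom strings, a morphism is a symbol-renaming map, and the meaning-preservation obligation is checked by naive substitution. The monoid theories and the `translate` helper are illustrative inventions, not taken from the paper.

```python
# Toy illustration of a theory morphism: a meaning-preserving
# translation from one small axiomatic theory into another.
# Theories here are sets of axiom strings; a morphism is a
# symbol-for-symbol renaming map.

def translate(formula, symbol_map):
    """Apply a symbol-for-symbol translation to a formula."""
    for src, dst in symbol_map.items():
        formula = formula.replace(src, dst)
    return formula

# Source theory: a monoid written additively.
monoid_add = {"(x + y) + z = x + (y + z)", "x + 0 = x", "0 + x = x"}

# Target theory: a monoid written multiplicatively (with an extra
# commutativity theorem the morphism does not need).
monoid_mul = {"(x * y) * z = x * (y * z)", "x * 1 = x", "1 * x = x",
              "x * y = y * x"}

# The morphism maps + to * and 0 to 1.
morphism = {"+": "*", "0": "1"}

# Obligation: every translated source axiom must hold in the target.
obligations = {translate(ax, morphism) for ax in monoid_add}
assert obligations <= monoid_mul
```

    In the real setting the obligation is not set membership but provability: each translated axiom must be a theorem of the target theory.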

    Formalizing Mathematical Knowledge as a Biform Theory Graph: A Case Study

    A biform theory is a combination of an axiomatic theory and an algorithmic theory that supports the integration of reasoning and computation. These are ideal for formalizing algorithms that manipulate mathematical expressions. A theory graph is a network of theories connected by meaning-preserving theory morphisms that map the formulas of one theory to the formulas of another theory. Theory graphs are in turn well suited for formalizing mathematical knowledge at the most convenient level of abstraction using the most convenient vocabulary. We are interested in the problem of whether a body of mathematical knowledge can be effectively formalized as a theory graph of biform theories. As a test case, we look at the graph of theories encoding natural number arithmetic. We used two different formalisms to do this, which we describe and compare. The first is realized in ${\rm CTT}_{\rm uqe}$, a version of Church's type theory with quotation and evaluation, and the second is realized in Agda, a dependently typed programming language. Comment: 43 pages; published without appendices in: H. Geuvers et al., eds, Intelligent Computer Mathematics (CICM 2017), Lecture Notes in Computer Science, Vol. 10383, pp. 9-24, Springer, 201

    The virtues and vices of equilibrium and the future of financial economics

    The use of equilibrium models in economics springs from the desire for parsimonious models of economic phenomena that take human reasoning into account. This approach has been the cornerstone of modern economic theory. We explain why this is so, extolling the virtues of equilibrium theory; then we present a critique and describe why this approach is inherently limited, and why economics needs to move in new directions if it is to continue to make progress. We stress that this shouldn't be a question of dogma, but should be resolved empirically. There are situations where equilibrium models provide useful predictions and there are situations where they can never provide useful predictions. There are also many situations where the jury is still out, i.e., where so far they fail to provide a good description of the world, but where proper extensions might change this. Our goal is to convince the skeptics that equilibrium models can be useful, but also to make traditional economists more aware of the limitations of equilibrium models. We sketch some alternative approaches and discuss why they should play an important role in future research in economics. Comment: 68 pages, one figure

    The long memory of the efficient market

    For the London Stock Exchange we demonstrate that the signs of orders obey a long-memory process. The autocorrelation function decays roughly as $\tau^{-\alpha}$ with $\alpha \approx 0.6$, corresponding to a Hurst exponent $H \approx 0.7$. This implies that the signs of future orders are quite predictable from the signs of past orders; all else being equal, this would suggest a very strong market inefficiency. We demonstrate, however, that fluctuations in order signs are compensated for by anti-correlated fluctuations in transaction size and liquidity, which are also long-memory processes. This tends to make the returns whiter. We show that some institutions display long-range memory and others don't. Comment: 19 pages, 12 figures
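    The two quantities in the abstract are linked by a standard identity for long-memory processes: if the autocorrelation decays as $\tau^{-\alpha}$ with $0 < \alpha < 1$, the integrated series has Hurst exponent $H = 1 - \alpha/2$, so $\alpha \approx 0.6$ gives $H \approx 0.7$. A minimal sketch of that relation and of a plain sample autocorrelation (not the paper's estimator):

```python
# Relation between the power-law decay exponent of the sign
# autocorrelation, C(tau) ~ tau^(-alpha), and the Hurst exponent
# of the integrated sign series: H = 1 - alpha / 2.

def hurst_from_decay(alpha):
    """Hurst exponent implied by autocorrelation exponent alpha."""
    return 1.0 - alpha / 2.0

def autocorrelation(signs, tau):
    """Plain sample autocorrelation of a +1/-1 sign series at lag tau."""
    n = len(signs) - tau
    mean = sum(signs) / len(signs)
    cov = sum((signs[t] - mean) * (signs[t + tau] - mean)
              for t in range(n)) / n
    var = sum((s - mean) ** 2 for s in signs) / len(signs)
    return cov / var

# alpha ~ 0.6, as measured for the LSE, gives H ~ 0.7.
assert abs(hurst_from_decay(0.6) - 0.7) < 1e-12
```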

    An empirical behavioral model of price formation

    Although behavioral economics has demonstrated that there are many situations where rational choice is a poor empirical model, it has so far failed to provide quantitative models of economic problems such as price formation. We make a step in this direction by developing empirical models that capture behavioral regularities in trading order placement and cancellation using data from the London Stock Exchange. For order placement we show that the probability of placing an order at a given price is well approximated by a Student distribution with less than two degrees of freedom, centered on the best quoted price. This result is surprising because it implies that trading order placement is symmetric, independent of the bid-ask spread, and the same for buying and selling. We also develop a crude but simple cancellation model that depends on the position of an order relative to the best price and the imbalance between buying and selling orders in the limit order book. These results are combined to construct a stochastic representative agent model, in which the orders and cancellations are described in terms of conditional probability distributions. This model is used to simulate price formation and the results are compared to real data from the London Stock Exchange. Without adjusting any parameters based on price data, the model produces good predictions for the magnitude and functional form of the distribution of returns and the bid-ask spread.
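    The order-placement regularity can be sketched as follows: relative limit prices drawn from a Student-t distribution with fewer than two degrees of freedom, centered on the best quoted price. The degrees of freedom (1.3) and scale used here are illustrative stand-ins, not the fitted values from the paper, and the t-sampler uses the standard normal/chi-square construction.

```python
import math
import random

# Illustrative sketch of symmetric order placement: relative prices
# drawn from a heavy-tailed Student-t centered on the best quote.
# nu and scale are assumed values, not the paper's fitted parameters.

def student_t(nu, rng):
    """Draw from a Student-t with nu d.o.f. via normal / chi-square."""
    z = rng.gauss(0.0, 1.0)
    chi2 = rng.gammavariate(nu / 2.0, 2.0)  # chi-square with nu d.o.f.
    return z / math.sqrt(chi2 / nu)

def place_order(best_price, nu=1.3, scale=0.5, rng=random):
    """Order price: best quote plus a symmetric heavy-tailed offset."""
    return best_price + scale * student_t(nu, rng)

rng = random.Random(42)
prices = [place_order(100.0, rng=rng) for _ in range(10_000)]

# With nu < 2 the variance is infinite (heavy tails), yet the draws
# remain symmetric about the best price, so the median sits near it.
median = sorted(prices)[len(prices) // 2]
assert abs(median - 100.0) < 0.1
```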

    An empirical behavioral model of liquidity and volatility

    We develop a behavioral model for liquidity and volatility based on empirical regularities in trading order flow in the London Stock Exchange. This can be viewed as a very simple agent-based model in which all components of the model are validated against real data. Our empirical studies of order flow uncover several interesting regularities in the way trading orders are placed and cancelled. The resulting simple model of order flow is used to simulate price formation under a continuous double auction, and the statistical properties of the resulting simulated sequence of prices are compared to those of real data. The model is constructed using one stock (AZN) and tested on 24 other stocks. For low volatility, small tick size stocks (called Group I) the predictions are very good, but for stocks outside Group I they are not. For Group I, the model predicts the correct magnitude and functional form of the distribution of the volatility and the bid-ask spread, without adjusting any parameters based on prices. This suggests that at least for Group I stocks, the volatility and heavy tails of prices are related to market microstructure effects, and supports the hypothesis that, at least on short time scales, the large fluctuations of absolute returns are well described by a power law with an exponent that varies from stock to stock.

    The price dynamics of common trading strategies

    A deterministic trading strategy can be regarded as a signal processing element that uses external information and past prices as inputs and incorporates them into future prices. This paper uses a market maker based method of price formation to study the price dynamics induced by several commonly used financial trading strategies, showing how they amplify noise, induce structure in prices, and cause phenomena such as excess and clustered volatility. Comment: 29 pages, 12 figures
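    The mechanism can be sketched with a minimal market-maker price update in which the log price moves in proportion to net order flow, p[t+1] = p[t] + omega[t]/lam, fed by a trend-following strategy plus noise-trader orders. The linear impact rule and all parameter values here are illustrative assumptions, not the paper's calibration; the point is only that a trend follower feeds past price moves back into future prices and thereby induces structure (here, positive return autocorrelation).

```python
import random

# Minimal market-maker price formation: the price change is the net
# order flow divided by the market maker's liquidity parameter lam.
# A trend follower buys after up-moves and sells after down-moves.
# All parameters are illustrative.

def simulate(steps=1000, lam=10.0, trend=5.0, noise=1.0, seed=0):
    rng = random.Random(seed)
    p = [0.0, 0.0]
    for t in range(1, steps):
        # Trend-following demand plus random noise-trader orders.
        omega = trend * (p[t] - p[t - 1]) + noise * rng.gauss(0.0, 1.0)
        p.append(p[t] + omega / lam)
    return p

p = simulate()
returns = [b - a for a, b in zip(p, p[1:])]

# The feedback loop turns white noise into positively autocorrelated
# returns, i.e. the strategy induces structure in prices.
mean = sum(returns) / len(returns)
ac1 = sum((returns[t] - mean) * (returns[t + 1] - mean)
          for t in range(len(returns) - 1))
var = sum((r - mean) ** 2 for r in returns)
assert ac1 / var > 0.1
```

    With these parameters the return process is effectively an AR(1) with coefficient trend/lam = 0.5, which is one simple way a strategy "incorporates past prices into future prices".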

    The dynamics of the leverage cycle

    We present a simple agent-based model of a financial system composed of leveraged investors such as banks that invest in stocks and manage their risk using a Value-at-Risk constraint, based on historical observations of asset prices. The Value-at-Risk constraint implies that when perceived risk is low, leverage is high and vice versa, a phenomenon that has been dubbed pro-cyclical leverage. We show that this leads to endogenous irregular oscillations, in which gradual increases in stock prices and leverage are followed by drastic market collapses, i.e. a leverage cycle. This phenomenon is studied using simplified models that give a deeper understanding of the dynamics and the nature of the feedback loops and instabilities underlying the leverage cycle. We introduce a flexible leverage regulation policy in which it is possible to continuously tune from pro-cyclical to countercyclical leverage. When the policy is sufficiently countercyclical and bank risk is sufficiently low, the endogenous oscillation disappears and prices go to a fixed point. While there is always a leverage ceiling above which the dynamics are unstable, countercyclical leverage can be used to raise the ceiling. We also study the impact on leverage cycles of direct, temporal control of the bank's riskiness via the bank's required Value-at-Risk quantile. Under such a rule the regulator relaxes the Value-at-Risk quantile following a negative stock price shock and tightens it following a positive shock. While such a policy rule can reduce the amplitude of leverage cycles, its effectiveness is highly dependent on the choice of parameters. Finally, we investigate fixed limits on leverage and show how they can control the leverage cycle. Comment: 35 pages, 9 figures
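    The tunable leverage policy can be sketched as a one-parameter family of leverage targets of the form lam = alpha * sigma^(-2b): at one end it reproduces the pro-cyclical Value-at-Risk rule lam = alpha / sigma, and pushing the exponent negative makes leverage countercyclical. The parameterization, names, and numbers below are illustrative assumptions consistent with the abstract's description, not the paper's calibrated model.

```python
# Sketch of a tunable leverage regulation policy. sigma2 is the
# bank's perceived return variance estimated from recent prices.
# b = 0.5 gives the plain pro-cyclical VaR rule lam = alpha / sigma;
# b < 0 reverses the cyclicality. alpha and the variances below are
# illustrative values.

def target_leverage(sigma2, alpha=0.1, b=0.5):
    """Leverage target lam = alpha * sigma2**(-b)."""
    return alpha * sigma2 ** (-b)

calm, turbulent = 0.0001, 0.01  # perceived return variance

# Pro-cyclical (b = 0.5): leverage is high when perceived risk is low,
# the feedback that drives the leverage cycle.
assert target_leverage(calm) > target_leverage(turbulent)

# Countercyclical (b < 0): the relationship reverses, which is what
# damps the endogenous oscillation in the model.
assert target_leverage(calm, b=-0.5) < target_leverage(turbulent, b=-0.5)
```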