
    A traffic model adapted to the workload volatility of a Video on Demand service: identification, validation and application to dynamic resource management

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. The cost of resources varies significantly depending on their configuration, so efficient management of resources is of prime interest to both Cloud Providers and Cloud Users. In this report we propose a probabilistic resource provisioning approach that can be exploited as the input of a dynamic resource management scheme. Using a Video on Demand use case to justify our claims, we propose an analytical model, inspired by standard models developed for epidemic spreading, to represent sudden and intense workload variations. As an essential step we also derive a heuristic identification procedure to calibrate all the model parameters, and we evaluate the performance of our estimator on synthetic time series. We show how well our model fits real workload traces, with respect to the stationary case, in terms of steady-state probability and autocorrelation structure. We find that the resulting model verifies a Large Deviation Principle that statistically characterizes extreme and rare events, such as those produced by the "buzz effects" that may cause workload overflow in the VoD context. This analysis provides valuable insight into the abnormal behaviors a system can be expected to exhibit, and we exploit it to dimension the resources to provision (e.g. bandwidth, number of servers, buffer sizes) so as to strike a good compromise between infrastructure re-deployment cost and quality of service. We use the information obtained from the Large Deviation Principle in the Video on Demand use case to define Service Level Agreement policies. We believe these policies for elastic resource provisioning and usage may be of interest to all stakeholders in the emerging context of cloud networking.
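The abstract does not spell out the model's equations. As a minimal sketch of the idea, assuming a hypothetical birth-death dynamic in which a word-of-mouth term added to a constant arrival rate mimics epidemic spreading (all parameter names and values below are illustrative, not the paper's calibrated model), one could write:

```python
import random

def simulate_vod_workload(steps=5000, base_rate=0.5, contagion=0.002,
                          departure_prob=0.01, seed=42):
    """Toy epidemic-inspired workload model (hypothetical parameters).

    New viewers arrive at a constant base rate plus a word-of-mouth
    term proportional to the current audience; each connected viewer
    leaves independently with a fixed probability per step.
    """
    rng = random.Random(seed)
    viewers = 0
    trace = []
    for _ in range(steps):
        arrival_rate = base_rate + contagion * viewers
        # crude Poisson-like draw via 100 Bernoulli trials
        arrivals = sum(rng.random() < arrival_rate / 100 for _ in range(100))
        departures = sum(rng.random() < departure_prob for _ in range(viewers))
        viewers += arrivals - departures
        trace.append(viewers)
    return trace

trace = simulate_vod_workload()
```

With `contagion` weaker than `departure_prob` the chain settles around a finite audience; a transient boost to `contagion` would produce the sudden buzz-like excursions the paper models.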

    An empirical behavioral model of liquidity and volatility

    We develop a behavioral model for liquidity and volatility based on empirical regularities in trading order flow in the London Stock Exchange. This can be viewed as a very simple agent-based model in which all components of the model are validated against real data. Our empirical studies of order flow uncover several interesting regularities in the way trading orders are placed and cancelled. The resulting simple model of order flow is used to simulate price formation under a continuous double auction, and the statistical properties of the resulting simulated sequence of prices are compared to those of real data. The model is constructed using one stock (AZN) and tested on 24 other stocks. For low-volatility, small-tick-size stocks (called Group I) the predictions are very good, but for stocks outside Group I they are not. For Group I, the model predicts the correct magnitude and functional form of the distribution of the volatility and the bid-ask spread, without adjusting any parameters based on prices. This suggests that, at least for Group I stocks, the volatility and heavy tails of prices are related to market microstructure effects, and supports the hypothesis that, at least on short time scales, the large fluctuations of absolute returns are well described by a power law with an exponent that varies from stock to stock.
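The calibrated order-flow model itself is not reproduced here. A deliberately crude zero-intelligence sketch of price formation under a continuous double auction (all rates, prices, and names below are hypothetical) might look like:

```python
import heapq
import random

def zero_intelligence_prices(steps=2000, seed=7):
    """Minimal zero-intelligence order-flow sketch (hypothetical rates).

    Random limit orders accumulate in bid/ask books; random market
    orders execute against the best opposite quote, which becomes the
    new transaction price.
    """
    rng = random.Random(seed)
    bids = []          # max-heap of bid prices, stored negated
    asks = []          # min-heap of ask prices
    price = 100.0
    history = []
    for _ in range(steps):
        u = rng.random()
        if u < 0.4:                      # limit buy just below the price
            heapq.heappush(bids, -(price - rng.random()))
        elif u < 0.8:                    # limit sell just above the price
            heapq.heappush(asks, price + rng.random())
        elif u < 0.9 and asks:           # market buy lifts the best ask
            price = heapq.heappop(asks)
        elif bids:                       # market sell hits the best bid
            price = -heapq.heappop(bids)
        history.append(price)
    return history

prices = zero_intelligence_prices()
```

The statistical properties of such a simulated price series (spread, volatility distribution) are what the paper compares against real data, using order-placement and cancellation rates measured empirically rather than the uniform rates assumed here.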

    How markets slowly digest changes in supply and demand

    In this article we revisit the classic problem of tatonnement in price formation from a microstructure point of view, reviewing a recent body of theoretical and empirical work explaining how fluctuations in supply and demand are slowly incorporated into prices. Because revealed market liquidity is extremely low, large orders to buy or sell can only be traded incrementally, over periods of time as long as months. As a result, order flow is a highly persistent long-memory process. Maintaining compatibility with market efficiency has profound consequences on price formation, on the dynamics of liquidity, and on the nature of impact. We review a body of theory that makes detailed quantitative predictions about the volume and time dependence of market impact, the bid-ask spread, order book dynamics, and volatility. Comparisons to data yield some encouraging successes. This framework suggests a novel interpretation of financial information, in which agents are at best only weakly informed and all have a similar and extremely noisy impact on prices. Most of the processed information appears to come from supply and demand itself, rather than from external news. The ideas reviewed here are relevant to market microstructure regulation, agent-based models, cost-optimal execution strategies, and understanding market ecologies. Comment: 111 pages, 24 figures.

    The long memory of the efficient market

    For the London Stock Exchange we demonstrate that the signs of orders obey a long-memory process. The autocorrelation function decays roughly as $\tau^{-\alpha}$ with $\alpha \approx 0.6$, corresponding to a Hurst exponent $H \approx 0.7$. This implies that the signs of future orders are quite predictable from the signs of past orders; all else being equal, this would suggest a very strong market inefficiency. We demonstrate, however, that fluctuations in order signs are compensated for by anti-correlated fluctuations in transaction size and liquidity, which are also long-memory processes. This tends to make the returns whiter. We show that some institutions display long-range memory and others do not. Comment: 19 pages, 12 figures.
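The lag-dependent sign correlation the paper measures can be estimated with a standard sample autocorrelation. A small illustrative helper (not the paper's exact estimator):

```python
def sign_autocorrelation(signs, max_lag):
    """Sample autocorrelation function of a +1/-1 order-sign series."""
    n = len(signs)
    mean = sum(signs) / n
    var = sum((s - mean) ** 2 for s in signs) / n
    return [
        sum((signs[t] - mean) * (signs[t + lag] - mean)
            for t in range(n - lag)) / ((n - lag) * var)
        for lag in range(1, max_lag + 1)
    ]

# a perfectly alternating series is maximally anti-correlated at lag 1
acf = sign_autocorrelation([1, -1] * 500, max_lag=2)  # acf[0] is exactly -1
```

On real order-sign data, fitting $\log \mathrm{acf}(\tau)$ against $\log \tau$ over a range of lags would give the decay exponent $\alpha$ the paper reports.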

    Trading activity and price impact in parallel markets: SETS vs. off-book market at the London Stock Exchange

    We empirically study the trading activity in the electronic on-book segment and in the dealership off-book segment of the London Stock Exchange, investigating separately the trading of active market members and of other market participants which are non-members. We find that (i) the volume distribution of off-book transactions has a significantly fatter tail than that of on-book transactions, (ii) groups of members and non-members can be classified in categories according to their trading profile, (iii) there is a strong anticorrelation between the daily inventory variation of a market member due to on-book market transactions and the inventory variation due to off-book market transactions with non-members, and (iv) the autocorrelation of the sign of the orders of non-members in the off-book market is slowly decaying. We also analyze the on-book price impact function over time, both for positive and negative lags, of the electronic trades and of the off-book trades. The unconditional impact curves are very different for the electronic trades and the off-book trades. Moreover, there is only a small dependence of impact on volume for the on-book electronic trades, while the shape and magnitude of the impact function of off-book transactions strongly depend on volume. Comment: 16 pages, 9 figures.

    Studies of the limit order book around large price changes

    We study the dynamics of the limit order book of liquid stocks after large intra-day price changes. In the data we find large variations in several microscopic measures, e.g., the volatility, the bid-ask spread, the bid-ask imbalance, the number of queuing limit orders, and the activity (number and volume) of limit orders placed and cancelled. The relaxation of these quantities is generally very slow and can be described by a power law with exponent $\approx 0.4$. We introduce a numerical model in order to better understand the empirical results. We find that a zero-intelligence deposition model of the order flow can reproduce the empirical results qualitatively. This suggests that the slow relaxations might not be the result of agents' strategic behaviour. Studying the difference between the exponents found empirically and numerically helps us to better identify the role of strategic behaviour in these phenomena. Comment: 19 pages, 7 figures.
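A power-law relaxation exponent such as the $\approx 0.4$ reported above can be recovered from a decaying series by a least-squares fit in log-log coordinates. A minimal sketch (not the paper's estimation procedure, and on synthetic rather than real data):

```python
import math

def fit_power_law_exponent(times, values):
    """Least-squares slope of log(value) against log(time).

    If value ~ t**(-alpha), the fitted slope estimates -alpha,
    so we return its negation.
    """
    xs = [math.log(t) for t in times]
    ys = [math.log(v) for v in values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# synthetic relaxation decaying exactly as t**(-0.4)
times = list(range(1, 200))
spread = [t ** -0.4 for t in times]
alpha = fit_power_law_exponent(times, spread)  # recovers ~0.4
```

On noisy empirical relaxation curves the fit would be restricted to the scaling region and the exponent compared between data and the zero-intelligence simulation.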

    A Theory for Market Impact: How Order Flow Affects Stock Price

    It is known that the impact of transactions on stock price (market impact) is a concave function of the size of the order, but there exists little quantitative theory that suggests why this is so. I develop a quantitative theory for the market impact of hidden orders (orders that reflect the true intention of buying and selling) that matches the empirically measured result and that reproduces some of the non-trivial and universal properties of stock returns (returns are percent changes in stock price). The theory is based on a simple premise, that the stock market can be modeled in a mechanical way - as a device that translates order flow into an uncorrelated price stream. Given that order flow is highly autocorrelated, this premise requires that market impact (1) depends on past order flow and (2) is asymmetric for buying and selling. I derive the specific form for the dependence in (1) by assuming that current liquidity responds to information about all currently active hidden orders (liquidity is a measure of the price response to a transaction of a given size). This produces an equation that suggests market impact should scale logarithmically with total order size. Using data from the London Stock Exchange I empirically measure market impact and show that the result matches the theory. Also using empirical data, I qualitatively specify the asymmetry of (2). Putting all results together, I form a model for market impact that reproduces three universal properties of stock returns - that returns are uncorrelated, that returns are distributed with a power law tail, and that the magnitude of returns is highly autocorrelated (also known as clustered volatility). Comment: PhD Thesis, University of Illinois at Urbana-Champaign (2007), 124 pages.
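The prediction that impact scales logarithmically with total hidden-order size directly implies concavity: doubling an order far less than doubles its impact. A toy curve with hypothetical parameters (the thesis's fitted coefficients are not given here) illustrates this:

```python
import math

def hidden_order_impact(total_size, liquidity=1.0, scale=0.1):
    """Toy logarithmic impact curve (hypothetical parameters).

    Price impact grows like the log of total hidden-order size,
    hence is concave in size.
    """
    return scale * math.log(1.0 + total_size / liquidity)

# concavity: doubling the order size much less than doubles the impact
impact_100 = hidden_order_impact(100)
impact_200 = hidden_order_impact(200)
```

The logarithmic form is what distinguishes this theory from, e.g., square-root impact models, and is the shape the thesis tests against London Stock Exchange data.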