
    The power of patience: A behavioral regularity in limit order placement

    In this paper we demonstrate a striking regularity in the way people place limit orders in financial markets, using a data set consisting of roughly seven million orders from the London Stock Exchange. We define the relative limit price as the difference between the limit price and the best price available. Merging the data from 50 stocks, we demonstrate that for both buy and sell orders, the unconditional cumulative distribution of relative limit prices decays roughly as a power law with exponent approximately 1.5. This behavior spans more than two decades in relative price, ranging from a few ticks to about 2000 ticks. Time series of relative limit prices show interesting temporal structure, characterized by an autocorrelation function that asymptotically decays as tau^(-0.4). Furthermore, relative limit price levels are positively correlated with, and are led by, price volatility. This feedback may contribute to clustered volatility.
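
    The reported power-law decay of the cumulative distribution can be checked with a standard tail-exponent estimator. The sketch below is purely illustrative: synthetic Pareto-distributed "relative limit prices" stand in for the LSE order data, and the Hill estimator with a 5-tick cutoff is an assumption, not the authors' method.

```python
import numpy as np

# Hypothetical illustration (not the paper's code): estimate the tail exponent
# of relative limit prices with a Hill estimator. Synthetic Pareto data with a
# true cumulative-distribution exponent of 1.5 stand in for the LSE orders.
rng = np.random.default_rng(0)
alpha_true = 1.5
prices = rng.pareto(alpha_true, size=100_000) + 1.0   # ticks away from the best price

def hill_exponent(x, x_min):
    """Hill estimator of the tail exponent for samples at or above x_min."""
    tail = x[x >= x_min]
    return len(tail) / np.sum(np.log(tail / x_min))

print(f"estimated tail exponent: {hill_exponent(prices, x_min=5.0):.2f}")   # ~1.5
```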

    Sequential inverse problems: Bayesian principles and the logistic map example

    Bayesian statistics provides a general framework for solving inverse problems, but is not without interpretation and implementation problems. This paper discusses difficulties arising from the fact that forward models are always in error to some extent. Using a simple example based on the one-dimensional logistic map, we argue that, when implementation problems are minimal, the Bayesian framework is quite adequate. The Bayesian filter is shown to recover excellent state estimates in the perfect model scenario (PMS) and to distinguish the PMS from the imperfect model scenario (IMS). Through a quantitative comparison of the way in which the observations are assimilated in both the PMS and the IMS, we suggest that one can sometimes measure the degree of imperfection.
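
    The paper's filter implementation is not reproduced here; as a rough illustration of Bayesian filtering on the logistic map in a perfect-model setting, the sketch below runs a minimal bootstrap particle filter on noisy observations of the map. The map parameter, noise levels, and jitter are arbitrary assumed values.

```python
import numpy as np

# Illustrative sketch only: a bootstrap particle filter for the noisily
# observed logistic map, where the filter uses the same map that generated
# the data (a stand-in for the perfect model scenario). A small jitter is
# added after propagation to avoid particle degeneracy.
rng = np.random.default_rng(1)
r, sigma_obs, sigma_jit = 3.8, 0.05, 1e-3
n_steps, n_particles = 50, 2000

def logistic(x):
    return r * x * (1.0 - x)

# "Truth" trajectory and noisy observations of it.
truth = np.empty(n_steps)
truth[0] = 0.3
for t in range(1, n_steps):
    truth[t] = logistic(truth[t - 1])
obs = truth + sigma_obs * rng.standard_normal(n_steps)

# Bootstrap filter: propagate, weight by the Gaussian likelihood, resample.
particles = rng.uniform(0.0, 1.0, n_particles)
estimates = np.empty(n_steps)
for t in range(n_steps):
    if t > 0:
        particles = logistic(particles) + sigma_jit * rng.standard_normal(n_particles)
        particles = np.clip(particles, 1e-6, 1.0 - 1e-6)
    weights = np.exp(-0.5 * ((obs[t] - particles) / sigma_obs) ** 2) + 1e-300
    weights /= weights.sum()
    estimates[t] = np.sum(weights * particles)
    particles = rng.choice(particles, size=n_particles, p=weights)

print("RMS state error:", np.sqrt(np.mean((estimates - truth) ** 2)))
```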

    The Predictive Power of Zero Intelligence in Financial Markets

    Standard models in economics stress the role of intelligent agents who maximize utility. However, there may be situations where, for some purposes, constraints imposed by market institutions dominate intelligent agent behavior. We use data from the London Stock Exchange to test a simple model in which zero-intelligence agents place orders to trade at random. The model treats the statistical mechanics of order placement, price formation, and the accumulation of revealed supply and demand within the context of the continuous double auction, and yields simple laws relating order arrival rates to statistical properties of the market. We test the validity of these laws in explaining the cross-sectional variation for eleven stocks. The model explains 96% of the variance of the bid-ask spread, and 76% of the variance of the price diffusion rate, with only one free parameter. We also study the market impact function, describing the response of quoted prices to the arrival of new orders. The non-dimensional coordinates dictated by the model approximately collapse data from different stocks onto a single curve. This work is important from a practical point of view because it demonstrates the existence of simple laws relating prices to order flows, and in a broader context, because it suggests that there are circumstances where institutions are more important than strategic considerations.
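
    The zero-intelligence idea, in which order placement is random and market structure does the rest, can be mimicked with a toy continuous double auction. The event rates, price grid, and bookkeeping below are illustrative assumptions and are not calibrated to the model or the data in the paper.

```python
import numpy as np

# Toy sketch (not the authors' calibrated model): zero-intelligence agents
# submit limit orders at uniformly random prices, market orders remove the
# best opposite quote, and resting orders are cancelled at random.
rng = np.random.default_rng(2)
alpha, mu, delta = 1.0, 0.2, 0.05      # limit, market, and cancellation rates
n_events, price_levels = 50_000, 200

bids, asks = [], []                    # resting limit orders (integer price ticks)
spreads = []
for _ in range(n_events):
    u = rng.uniform(0.0, alpha + mu + delta)
    side = rng.integers(2)             # 0 = buy side, 1 = sell side
    book = bids if side == 0 else asks
    if u < alpha:                      # new limit order at a random price
        p = int(rng.integers(price_levels))
        if side == 0 and asks and p >= min(asks):
            asks.remove(min(asks))     # crossing buy order executes immediately
        elif side == 1 and bids and p <= max(bids):
            bids.remove(max(bids))     # crossing sell order executes immediately
        else:
            book.append(p)
    elif u < alpha + mu:               # market order hits the best opposite quote
        opposite = asks if side == 0 else bids
        if opposite:
            opposite.remove(min(asks) if side == 0 else max(bids))
    elif book:                         # random cancellation of a resting order
        book.pop(int(rng.integers(len(book))))
    if bids and asks:
        spreads.append(min(asks) - max(bids))

print("mean bid-ask spread (ticks):", np.mean(spreads))
```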

    Data assimilation using Bayesian filters and B-spline geological models

    This paper proposes a new approach to problems of data assimilation, also known as history matching, of oilfield production data by adjustment of the location and sharpness of patterns of geological facies. Traditionally, this problem has been addressed using gradient-based approaches with a level set parameterization of the geology. Gradient-based methods are robust, but computationally demanding for real-world reservoir problems and insufficient for uncertainty assessment in reservoir management. Recently, the ensemble filter approach has been used to tackle this problem because of its high efficiency from the standpoint of implementation, computational cost, and performance. Incorporation of level set parameterization in this approach could further deal with the lack of differentiability with respect to facies type, but its practical implementation is based on some assumptions that are not easily satisfied in real problems. In this work, we propose to describe the geometry of the permeability field using B-spline curves. This transforms history matching of the discrete facies type into the estimation of continuous B-spline control points. As the filtering scheme, we use the ensemble square-root filter (EnSRF). The efficacy of the EnSRF with the B-spline parameterization is investigated through three numerical experiments, in which the reservoir contains a curved channel, a disconnected channel, or a 2-dimensional closed feature. It is found that the application of the proposed method to the problem of adjusting facies edges to match production data is relatively straightforward and provides statistical estimates of the distribution of geological facies and of the state of the reservoir.
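
    As a rough sketch of the EnSRF analysis step named in the abstract (the state vector could, for instance, hold B-spline control points), the following applies the serial square-root update for a single scalar observation. The observation operator, noise variance, and ensemble size are illustrative assumptions rather than values from the paper.

```python
import numpy as np

# Minimal sketch of a serial ensemble square-root filter (EnSRF) update for one
# scalar observation. The state is a generic vector; H and R are illustrative.
def ensrf_update(ensemble, h, y_obs, r_var):
    """ensemble: (n_members, n_state); h: (n_state,) linear observation operator."""
    x_mean = ensemble.mean(axis=0)
    anomalies = ensemble - x_mean                        # X'
    hx = anomalies @ h                                   # H X', shape (n_members,)
    hph = hx @ hx / (len(ensemble) - 1)                  # H P H^T (scalar)
    pht = anomalies.T @ hx / (len(ensemble) - 1)         # P H^T, shape (n_state,)
    k = pht / (hph + r_var)                              # Kalman gain
    x_mean = x_mean + k * (y_obs - h @ x_mean)           # update the ensemble mean
    beta = 1.0 / (1.0 + np.sqrt(r_var / (hph + r_var)))  # square-root reduction factor
    anomalies = anomalies - beta * np.outer(hx, k)       # update the anomalies
    return x_mean + anomalies

rng = np.random.default_rng(3)
ens = rng.normal(size=(40, 10))                          # 40 members, 10-dim state
h = np.zeros(10); h[0] = 1.0                             # observe the first component
updated = ensrf_update(ens, h, y_obs=0.5, r_var=0.1)
print(updated.mean(axis=0)[0])                           # posterior mean of observed component
```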

    Full Wave Form Inversion for Seismic Data

    In seismic wave inversion, seismic waves are sent into the ground and then observed at many receiving points with the aim of producing high-resolution images of the geological underground details. The challenge presented by Saudi Aramco is to solve the inverse problem for multiple point sources on the full elastic wave equation, taking into account all frequencies for the best resolution. The state-of-the-art methods use optimisation to find the seismic properties of the rocks, such that, when used as coefficients in the model equations, the measurements are reproduced as closely as possible. This process requires regularisation if one is to avoid instability. The approach can produce a realistic image but does not account for uncertainty arising, in general, from the existence of many different patterns of properties that also reproduce the measurements. In the Study Group a formulation of the problem was developed, based upon the principles of Bayesian statistics. First, the state-of-the-art optimisation method was shown to be a special case of the Bayesian formulation. This result immediately provides insight into the most appropriate regularisation methods. Then a practical implementation of a sequential sampling algorithm, using forms of the Ensemble Kalman Filter, was devised and explored.
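
    The claim that the optimisation approach is a special case of the Bayesian formulation amounts, under Gaussian noise and prior assumptions, to identifying regularised least squares with the maximum a posteriori estimate. The notation below (model m, data d, forward operator G, covariances C_d and C_m) is generic and not taken from the report.

```latex
% Sketch under Gaussian assumptions; symbols are generic placeholders.
\begin{align*}
  p(m \mid d) &\propto
    \exp\!\Big(-\tfrac{1}{2}\,\|d - G(m)\|^{2}_{C_d^{-1}}\Big)\,
    \exp\!\Big(-\tfrac{1}{2}\,\|m - m_0\|^{2}_{C_m^{-1}}\Big), \\
  m_{\mathrm{MAP}} &= \operatorname*{arg\,min}_m \;
    \tfrac{1}{2}\,\|d - G(m)\|^{2}_{C_d^{-1}}
    + \tfrac{1}{2}\,\|m - m_0\|^{2}_{C_m^{-1}},
\end{align*}
% so the prior covariance C_m plays the role of the Tikhonov regularisation
% term, which is how the Bayesian view suggests an appropriate regularisation.
```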

    Qualitative measures of the responsiveness of postsecondary technical programs to the needs of industry

    Institutional responsiveness is a measure of how responsive an institution is to its market. The market includes all stakeholders (students, businesses, and the community). Several major issues were examined in this study, including responsiveness to the labor market, job placement, graduation rate, and quality of program. Implications from this study should be quite useful to IVETA and members of our global community. The study was funded by a grant from the Pennsylvania Bureau of Vocational Technical Education under the auspices of the Carl D. Perkins Vocational Education Act (Public Law 105-332, Oct. 31, 1998).

    Properties of Carbon-Oxygen White Dwarfs From Monte Carlo Stellar Models

    We investigate properties of carbon-oxygen white dwarfs with respect to the composite uncertainties in the reaction rates using the stellar evolution toolkit Modules for Experiments in Stellar Astrophysics (MESA) and the probability density functions in the reaction rate library STARLIB. These are the first Monte Carlo stellar evolution studies that use complete stellar models. Focusing on $3\,M_{\odot}$ models evolved from the pre-main-sequence to the first thermal pulse, we survey the remnant core mass, composition, and structure properties as a function of 26 STARLIB reaction rates covering hydrogen and helium burning, using a Principal Component Analysis and a Spearman rank-order correlation. Relative to the arithmetic mean value, we find the width of the 95\% confidence interval to be $\Delta M_{\rm 1TP} \approx 0.019\,M_{\odot}$ for the core mass at the first thermal pulse, $\Delta t_{\rm 1TP} \approx 12.50$ Myr for the age, $\Delta \log(T_{\rm c}/{\rm K}) \approx 0.013$ for the central temperature, $\Delta \log(\rho_{\rm c}/{\rm g\,cm^{-3}}) \approx 0.060$ for the central density, $\Delta Y_{\rm e,c} \approx 2.6\times10^{-5}$ for the central electron fraction, $\Delta X_{\rm c}(^{22}{\rm Ne}) \approx 5.8\times10^{-4}$, $\Delta X_{\rm c}(^{12}{\rm C}) \approx 0.392$, and $\Delta X_{\rm c}(^{16}{\rm O}) \approx 0.392$. Uncertainties in the experimental $^{12}{\rm C}(\alpha,\gamma)^{16}{\rm O}$, triple-$\alpha$, and $^{14}{\rm N}(p,\gamma)^{15}{\rm O}$ reaction rates dominate these variations. We also consider a grid of 1 to 6 $M_{\odot}$ models evolved from the pre-main-sequence to the final white dwarf to probe the sensitivity of the initial-final mass relation to experimental uncertainties in the hydrogen and helium reaction rates. (Accepted for publication in The Astrophysical Journal; 19 pages, 23 figures, 5 tables)
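
    As a toy illustration of the statistical machinery described here (Monte Carlo sampling of reaction-rate factors followed by a Spearman rank-order correlation against an output quantity), the sketch below uses a made-up analytic response in place of a MESA model; the rate names, spreads, and coefficients are placeholders.

```python
import numpy as np
from scipy.stats import spearmanr

# Toy illustration only: sample log-normal reaction-rate factors and rank-
# correlate them with a hypothetical core-mass response. Not MESA/STARLIB.
rng = np.random.default_rng(4)
n_samples = 1000
rate_names = ["triple-alpha", "12C(a,g)16O", "14N(p,g)15O"]
log_sigma = np.array([0.1, 0.2, 0.05])        # assumed log-normal rate spreads

factors = np.exp(log_sigma * rng.standard_normal((n_samples, 3)))

# Hypothetical response of a core property to the sampled rate factors.
core_mass = (0.55 + 0.02 * np.log(factors[:, 1]) - 0.01 * np.log(factors[:, 0])
             + 0.002 * rng.standard_normal(n_samples))

lo, hi = np.percentile(core_mass, [2.5, 97.5])
print(f"95% interval width: {hi - lo:.4f} Msun")
for name, col in zip(rate_names, factors.T):
    rho, _ = spearmanr(col, core_mass)
    print(f"Spearman rho({name}, core mass) = {rho:+.2f}")
```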

    Developing the competencies of interactional justice

    Grounded in social exchange theory, interpersonal and informational justice (collectively “IJ”) reflect the degree to which people affected by organizational decision makers perceive that they have been treated in a dignified and informative manner. Empirical research shows that IJ is positively correlated with myriad beneficial organizational outcomes (e.g., performance, job satisfaction and trust in authority figures) and negatively correlated with several noxious ones (e.g., withdrawal, negative reaction to decisions). The presence of IJ is an important mitigating factor in accepting negative organizational outcomes. In addition, the negative impact of injustice on an individual’s self-esteem can have profound implications for relationships among organizational stakeholders. The platform for introducing learners to IJ is a skills-based design for identification and use of fair behaviors. The experiential exercise is also designed to facilitate observational skills in seeing the consequences of IJ in organizational life – particularly as its presence or absence affects the communication flow in various interactions between managers and their subordinates.

    CROSS-COMPLIANCE. Facilitating the CAP reform: compliance and competitiveness of European agriculture. Specific Targeted Research or Innovation Project (STREP), Integrating and Strengthening the European Research Area. Deliverable 13: Product-based assessments to link compliance to standards at farm level to competitiveness

    This report summarizes the main results from the Cross-Compliance project. The core aim of this EU-funded research project is to analyse the external competitiveness impact arising from an improvement in the level of compliance with mandatory standards.