
    Stochastic kinetic models: Dynamic independence, modularity and graphs

    The dynamic properties and independence structure of stochastic kinetic models (SKMs) are analyzed. An SKM is a highly multivariate jump process used to model chemical reaction networks, particularly those in biochemical and cellular systems. We identify SKM subprocesses with the corresponding counting processes and propose a directed, cyclic graph (the kinetic independence graph or KIG) that encodes the local independence structure of their conditional intensities. Given a partition $[A, D, B]$ of the vertices, the graphical separation $A \perp B \mid D$ in the undirected KIG has an intuitive chemical interpretation and implies that $A$ is locally independent of $B$ given $A \cup D$. It is proved that this separation also results in global independence of the internal histories of $A$ and $B$ conditional on a history of the jumps in $D$ which, under conditions we derive, corresponds to the internal history of $D$. The results enable mathematical definition of a modularization of an SKM using its implied dynamics. Graphical decomposition methods are developed for the identification and efficient computation of nested modularizations. Application to an SKM of the red blood cell advances understanding of this biochemical system. Comment: Published at http://dx.doi.org/10.1214/09-AOS779 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
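    To make the graphical separation concrete, the sketch below builds a small, hypothetical kinetic independence graph with networkx and checks whether a vertex set D separates A from B in its undirected skeleton. The species names, the edges, and the separation routine are illustrative assumptions, not the paper's decomposition algorithms.

```python
# Minimal sketch (not the paper's algorithm): checking whether a vertex set D
# separates A from B in the undirected version of a hypothetical kinetic
# independence graph (KIG). Species and edges below are illustrative only.
import networkx as nx

# Hypothetical KIG: a directed edge u -> v indicates that the conditional
# intensity of v's counting process depends on the subprocess of u.
kig = nx.DiGraph()
kig.add_edges_from([
    ("S1", "S2"), ("S2", "S1"),   # reversible dependence between S1 and S2
    ("S2", "S3"), ("S3", "S2"),
    ("S3", "S4"), ("S4", "S3"),
])

def separated(graph, A, B, D):
    """Return True if D separates A from B in the undirected skeleton."""
    undirected = graph.to_undirected()
    undirected.remove_nodes_from(D)
    return not any(
        nx.has_path(undirected, a, b)
        for a in A for b in B
        if a in undirected and b in undirected
    )

A, D, B = {"S1"}, {"S2", "S3"}, {"S4"}
print(separated(kig, A, B, D))  # True: removing D disconnects A from B
```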

    A stochastic approximation scheme and convergence theorem for particle interactions with perfectly reflecting boundaries

    We prove the existence of a solution to an equation governing the number density within a compact domain of a discrete particle system for a prescribed class of particle interactions, taking into account the effects of diffusion and drift of the set of particles. Each particle carries a number of internal coordinates which may evolve continuously in time, determined by what we will refer to as the internal drift, or discretely via the interaction kernels. Perfectly reflecting boundary conditions are imposed on the system and all the processes may be spatially and temporally inhomogeneous. We use a relative compactness argument to construct a sequence of measures that converge weakly to a solution of the governing equation. Since the proof of existence is a constructive one, it provides a stochastic approximation scheme that can be used for the numerical study of molecular dynamics. Comment: 43 pages.
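    As a rough illustration of the kind of particle update such a stochastic approximation scheme relies on, the sketch below performs Euler-Maruyama steps for particles with drift and diffusion inside a compact box, folding any excursion back into the domain to mimic perfectly reflecting boundaries. The drift, diffusion coefficient, box limits, and time step are placeholder assumptions; the internal coordinates and interaction kernels treated in the paper are omitted.

```python
# Illustrative sketch only: an Euler-Maruyama step for particles diffusing with
# drift inside a compact box, with perfectly reflecting boundaries. The drift,
# diffusion coefficient, and box limits are placeholder assumptions, not values
# from the paper.
import numpy as np

rng = np.random.default_rng(0)

def reflect(x, lo, hi):
    """Fold positions back into [lo, hi] by mirror reflection."""
    width = hi - lo
    y = np.mod(x - lo, 2.0 * width)
    return lo + np.where(y > width, 2.0 * width - y, y)

def step(positions, dt, drift=0.1, diffusion=0.5, lo=0.0, hi=1.0):
    """Advance all particle positions by one time step."""
    noise = rng.normal(scale=np.sqrt(2.0 * diffusion * dt), size=positions.shape)
    proposed = positions + drift * dt + noise
    return reflect(proposed, lo, hi)

particles = rng.uniform(0.0, 1.0, size=(1000, 3))   # 1000 particles in a 3-D box
for _ in range(100):
    particles = step(particles, dt=1e-3)
print(particles.min(), particles.max())  # all coordinates remain inside [0, 1]
```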

    Modelling Security Market Events in Continuous Time: Intensity Based, Multivariate Point Process Models

    A continuous time econometric modelling framework for multivariate financial market event (or 'transactions') data is developed in which the model is specified via the vector stochastic intensity. This has the advantage that the conditioning sigma-field is updated continuously in time as new information arrives. The class of generalised Hawkes models is introduced, which allows the estimation of the dependence of the intensity on the events of previous trading days. Analytic likelihoods are available and it is shown how to construct diagnostic tests based on the transformation of non-Poisson processes into standard Poisson processes using random changes of time. A proof of the validity of the diagnostic testing procedures is given that imposes only a very weak condition on the point process model, thus establishing their widespread applicability. A continuous time, bivariate point process model of the timing of trades and mid-quote changes is presented for a New York Stock Exchange stock and the empirical findings are related to the theoretical and empirical market microstructure literature. The two-way interaction of trades and quote changes is found to be important empirically. Keywords: point and counting processes, multivariate, intensity, Hawkes process, diagnostics, goodness of fit, specification tests, change of time, transactions data, NYSE, market microstructure.
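    The random-time-change idea behind such diagnostics can be sketched for a univariate Hawkes process with an exponential kernel: evaluating the integrated intensity (compensator) at the observed event times should produce unit-exponential inter-arrival times if the model is correctly specified. The snippet below is only an illustration of that transformation with hypothetical parameters; the paper's tests are multivariate and considerably more general.

```python
# Sketch of a random-time-change diagnostic for a univariate Hawkes process with
# exponential kernel (illustrative only; the paper develops multivariate tests).
# Parameter values mu, alpha, beta below are hypothetical.
import numpy as np
from scipy import stats

def compensator(event_times, mu, alpha, beta):
    """Integrated intensity Lambda(t_k) evaluated at each event time t_k."""
    t = np.asarray(event_times, dtype=float)
    lam = np.empty_like(t)
    for k, tk in enumerate(t):
        earlier = t[:k]
        lam[k] = mu * tk + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (tk - earlier)))
    return lam

def time_change_test(event_times, mu, alpha, beta):
    """KS test: transformed inter-arrival times should be unit exponential."""
    lam = compensator(event_times, mu, alpha, beta)
    residuals = np.diff(np.concatenate(([0.0], lam)))
    return stats.kstest(residuals, "expon")

# Example with hypothetical parameters and synthetic (here: Poisson) event times.
rng = np.random.default_rng(1)
times = np.cumsum(rng.exponential(scale=1.0, size=500))
print(time_change_test(times, mu=1.0, alpha=0.0, beta=1.0))
```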

    Modelling Security Market Events in Continuous Time: Intensity Based, Multivariate Point Process Models

    A continuous time econometric modelling framework for multivariate financial market event (or 'transactions') data is developed in which the model is specified via the vector conditional intensity. This has the advantage that the conditioning information set is updated continuously in time as new information arrives. Generalised Hawkes (g-Hawkes) models are introduced that are sufficiently flexible to incorporate 'inhibitory' events and dependence between trading days. Novel omnibus specification tests for parametric models based on a multivariate random time change theorem are proposed. A computationally efficient thinning algorithm for simulation of g-Hawkes processes is also developed. A continuous time, bivariate point process model of the timing of trades and mid-quote changes is presented for a New York Stock Exchange stock and the empirical findings are related to the market microstructure literature. The two-way interaction of trades and quote changes is found to be important empirically. Furthermore, the model delivers a continuous record of instantaneous volatility that is conditional on the timing of trades and quote changes. Keywords: point process, conditional intensity, Hawkes process, specification test, random time change, transactions data, market microstructure.
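    The thinning idea can be illustrated in the simplest setting, a univariate Hawkes process with an exponential kernel: dominate the conditional intensity with a piecewise-constant upper bound and accept candidate points with probability intensity/bound (Ogata-style thinning). The parameters below are made up, and the paper's algorithm for the g-Hawkes class is more general than this sketch.

```python
# Sketch of Ogata-style thinning for simulating a univariate Hawkes process with
# exponential kernel. The paper's algorithm targets the more general g-Hawkes
# class; this only illustrates the thinning idea, with made-up parameters.
import numpy as np

def intensity(t, history, mu, alpha, beta):
    """Conditional intensity at time t given strictly earlier event times."""
    history = np.asarray(history, dtype=float)
    past = history[history < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate event times on [0, horizon] by thinning a dominating Poisson process."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    while t < horizon:
        # Upper bound on the intensity just after t (valid because the
        # exponential kernel decays between events).
        lam_bar = intensity(t, events, mu, alpha, beta) + alpha
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        if rng.uniform() <= intensity(t, events, mu, alpha, beta) / lam_bar:
            events.append(t)
    return np.array(events)

events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=1000.0)
print(len(events))  # expected count is roughly mu * horizon / (1 - alpha / beta)
```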

    Stationarity and the term structure of interest rates: a characterisation of stationary and unit root yield curves

    The nature of yield curve dynamics and the determinants of the integration order of yields are investigated using a benchmark economy in which the logarithmic expectations theory holds and the regularity condition of a limiting yield and limiting term premium is satisfied. A linear vector autoregressive process is constructed for a zero-coupon yield curve with a complete term structure of maturities, providing an arbitrarily accurate moving average representation of the complete yield curve as its cross-sectional dimension $n$ goes to infinity. We use this to prove the following novel results. First, any I(2) component vanishes owing to the almost sure (a.s.) convergence of the innovations to yields, $v_t(n)$, as $n \to \infty$. Second, the yield curve is stationary if and only if $n v_t(n)$ converges a.s., or equivalently the innovations to log discount bond prices converge a.s.; otherwise yields are I(1). A necessary condition for either stationarity or the absence of arbitrage is that the limiting yield is constant over time. Since the time-varying component of term premia is small in various fixed-income markets, these results provide insight into the critical determinants of the stationarity properties of the term structure. Keywords: Econometric models; Interest rates.
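    To fix notation for the statements above, one standard way of writing the logarithmic expectations theory with a term premium, together with the associated log discount bond price, is sketched below; the paper's exact formulation may differ.

```latex
% Illustrative notation only; the paper's exact formulation may differ.
% n-period zero-coupon yield y_t(n), one-period yield y_t(1), term premium \phi(n):
y_t(n) \;=\; \frac{1}{n}\sum_{i=0}^{n-1} \mathbb{E}_t\!\left[\, y_{t+i}(1) \,\right] \;+\; \phi(n),
\qquad
p_t(n) \;=\; -\, n \, y_t(n).
```

    Because the log discount bond price is $p_t(n) = -n\,y_t(n)$, innovations to log prices are $-n$ times the yield innovations $v_t(n)$, which is why almost sure convergence of $n v_t(n)$ is equivalent to convergence of the price innovations in the stationarity characterisation quoted above.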

    The dynamics of economic functions: modelling and forecasting the yield curve

    The class of Functional Signal plus Noise (FSN) models is introduced, providing a new, general method for modelling and forecasting time series of economic functions. The underlying, continuous economic function (or "signal") is a natural cubic spline whose dynamic evolution is driven by a cointegrated vector autoregression for the ordinates (or "y-values") at the knots of the spline. The natural cubic spline provides flexible cross-sectional fit and results in a linear, state space model. This FSN model achieves dimension reduction, provides a coherent description of the observed yield curve and its dynamics as the cross-sectional dimension N becomes large, and can feasibly be estimated and used for forecasting when N is large. The integration and cointegration properties of the model are derived. The FSN models are then applied to forecasting 36-dimensional yield curves for US Treasury bonds at the one-month-ahead horizon. The method consistently outperforms the Diebold and Li (2006) and random walk forecasts on the basis of both mean square forecast error criteria and economically relevant loss functions derived from the realised profits of pairs trading algorithms. The analysis also highlights, in a concrete setting, the dangers of attempting to infer the relative economic value of model forecasts on the basis of their associated mean square forecast errors. Keywords: Time-series analysis; Forecasting; Mathematical models; Macroeconomics - Econometric models.
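    A minimal sketch of the FSN idea (not the authors' estimator) is given below: represent the curve as a natural cubic spline through a handful of knot yields and forecast the knot ordinates with a simple least-squares VAR(1). The knot locations, the maturity grid, the VAR order, and the synthetic data are all illustrative assumptions; the paper works with a cointegrated VAR in state space form.

```python
# Minimal sketch of the FSN idea (not the authors' estimator): represent the yield
# curve as a natural cubic spline through a few knot yields and forecast the knot
# ordinates with a simple VAR(1) fitted by least squares. Knots, maturities, and
# the VAR order are illustrative assumptions.
import numpy as np
from scipy.interpolate import CubicSpline

knots = np.array([1.0, 12.0, 36.0, 60.0, 120.0])       # maturities in months (assumed)
maturities = np.arange(1, 121)                          # "complete" grid of 120 maturities

def curve_from_knots(knot_yields):
    """Natural cubic spline through the knot ordinates, evaluated on the full grid."""
    spline = CubicSpline(knots, knot_yields, bc_type="natural")
    return spline(maturities)

def var1_forecast(knot_history):
    """One-step forecast of knot yields from a least-squares VAR(1) with intercept."""
    Y = knot_history[1:]                                # (T-1, k) responses
    X = np.hstack([np.ones((len(Y), 1)), knot_history[:-1]])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)        # (k+1, k) coefficient matrix
    last = np.concatenate(([1.0], knot_history[-1]))
    return last @ coef

# Synthetic example: a slowly varying history of knot yields (T x k matrix).
rng = np.random.default_rng(2)
history = 4.0 + np.cumsum(rng.normal(scale=0.05, size=(200, len(knots))), axis=0)
forecast_knots = var1_forecast(history)
forecast_curve = curve_from_knots(forecast_knots)       # 120-dimensional forecast
print(forecast_curve[:5])
```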

    Paradigm not procedure: current challenges to police cultural incorporation of human rights in England and Wales

    This paper examines the extent to which human rights have become an organizational norm in UK policing, a decade after the Human Rights Act 1998 was enacted. Particular reference is made to covert investigation and human rights in the context of new police powers, enacted after the HRA 1998, that extended police access to and use of covert investigation. The paper concludes that attempts to achieve transparency through documentation have created the impression that protection of human rights is a bureaucratic process rather than a cultural paradigm.

    Nitrate pollution from horticultural production systems: tools for policy and advice from field to catchment scales

    The implementation of the Nitrates Directive has imposed a requirement to restrict N fertiliser and manuring practices on farms across the EU in order to reduce nitrate losses to water. These requirements have since been extended by the more demanding Water Framework Directive, which broadens the focus from the control of farm practices to a consideration of the impacts of pollutants from all sources on water quality at a catchment or larger scale. Together, these Directives set limits for water quality, and identify general strategies for how these might be achieved. However, it is the responsibility of policy makers in each Nation State to design the details of the management practices and environmental protection measures required to meet the objectives of the legislation, to ensure they are appropriate for their specific types of land use and climate. This paper describes various modelling tools for comparing different cropping and land use strategies, and illustrates with examples how they can inform policy makers about the environmental benefits of changing management practices and how to prioritise them. The results can help to provide the specific advice on N fertiliser and land use management required by farmers and growers at a field scale, and by environmental managers at a catchment or larger scale. A further example of how results from multiple catchments can be up-scaled and compared using Geographic Information Systems is also outlined.