
    A critical look at power law modelling of the Internet

    This paper takes a critical look at the usefulness of power law models of the Internet. The twin focuses of the paper are Internet traffic and topology generation. The aim of the paper is twofold: first, it summarises the state of the art in power law modelling, giving particular attention to existing open research questions; second, it provides insight into the failings of such models and where progress needs to be made for power law research to feed through to actual improvements in network performance. (To appear in Computer Communications.)
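    For concreteness, "power law modelling" here refers to fitting heavy-tailed distributions to quantities such as node degree or flow size. Below is a minimal sketch of the standard continuous maximum-likelihood (Hill-type) estimate of the tail exponent, assuming a pre-chosen cutoff x_min; the function name and toy data are illustrative, not from the paper.

    ```python
    import numpy as np

    def powerlaw_alpha_mle(x, x_min):
        """Continuous MLE of the power-law tail exponent alpha over the
        observations with x >= x_min:
            alpha_hat = 1 + n / sum(log(x_i / x_min))
        """
        tail = x[x >= x_min]
        return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

    # toy heavy-tailed sample: Pareto with true tail exponent alpha = 2.5
    rng = np.random.default_rng(7)
    sample = rng.pareto(1.5, size=50_000) + 1.0
    print(powerlaw_alpha_mle(sample, x_min=1.0))  # should be close to 2.5
    ```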

    Exact simulation of gamma-driven Ornstein–Uhlenbeck processes with finite and infinite activity jumps

    We develop a distributional decomposition approach for exactly simulating two types of Gamma-driven Ornstein–Uhlenbeck (OU) processes with time-varying marginal distributions: the Gamma-OU process and the OU-Gamma process. The former has finite-activity jumps, and its marginal distribution is asymptotically Gamma; the latter has infinite-activity jumps that are driven by a Gamma process. We prove that the transition distributions of the two processes at any given time can be exactly decomposed into simple elements: the former is equal in distribution to the sum of a deterministic trend and a compound Poisson random variable (r.v.); the latter is equal in distribution to the sum of a deterministic trend, a compound Poisson r.v., and a Gamma r.v. These results immediately lead to very efficient algorithms for exact simulation without numerical inversion. Extensive numerical experiments are reported to demonstrate the accuracy and efficiency of our algorithms.
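    The finite-activity case admits a classical exact transition that matches the trend-plus-compound-Poisson structure described above: conditional on the current value, the next value is an exponentially decayed trend plus a compound Poisson sum of damped exponential jumps. The sketch below covers only the stationary Gamma(a, b) special case (parameter names lam, a, b are ours), not the paper's more general time-varying decomposition.

    ```python
    import numpy as np

    def gamma_ou_step(x, dt, lam, a, b, rng):
        """One exact transition of a stationary Gamma-OU process with
        Gamma(a, b) marginal and mean-reversion rate lam:
        X_{t+dt} = exp(-lam*dt) * X_t          (deterministic trend)
                 + sum of damped Exp(b) jumps  (compound Poisson part),
        with jump count N ~ Poisson(lam * a * dt) and uniform jump times.
        """
        trend = np.exp(-lam * dt) * x
        n = rng.poisson(lam * a * dt)              # number of jumps in (t, t+dt]
        taus = rng.uniform(0.0, dt, size=n)        # jump arrival times
        jumps = rng.exponential(1.0 / b, size=n)   # Exp(b) jump sizes
        return trend + np.sum(np.exp(-lam * (dt - taus)) * jumps)

    rng = np.random.default_rng(42)
    lam, a, b = 1.0, 2.0, 3.0
    x = rng.gamma(a, 1.0 / b)   # start from the stationary Gamma(a, b) law
    path = [x]
    for _ in range(250):
        x = gamma_ou_step(x, 0.01, lam, a, b, rng)
        path.append(x)
    ```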

    Pilot interaction with automated airborne decision making systems

    An investigation was made of the interaction between a human pilot and automated on-board decision-making systems. Research was initiated on pilot problem solving in automated and semi-automated flight management systems, and attempts were made to develop a model of human decision making in a multi-task situation. The allocation of responsibility between human and computer was studied, and pilot performance parameters under varying degrees of automation were discussed. Optimal allocation of responsibility between human and computer was considered, and some theoretical results from the literature were presented. The pilot as a problem solver was also discussed. Finally, the design of displays, controls, procedures, and computer aids for problem-solving tasks in automated and semi-automated systems was considered.

    Event series prediction via non-homogeneous Poisson process modelling

    Data streams whose events occur at random arrival times, rather than at the regular, tick-tock intervals of traditional time series, are increasingly prevalent. Event series are continuous, irregular and often highly sparse, differing greatly in nature from the regularly sampled time series that have traditionally been the concern of the hard sciences. As mass sets of such data have become more common, interest in predicting future events in them has grown. Yet repurposing traditional forecasting approaches has proven ineffective, partly because of issues such as sparsity, but often because of inapplicable underpinning assumptions such as stationarity and ergodicity. In this paper we derive a principled new approach to forecasting event series that avoids such assumptions, based upon: (1) processing event series datasets to produce a parameterized mixture model of non-homogeneous Poisson processes; and (2) applying a technique called parallel forecasting that uses these processes’ rate functions to directly generate accurate temporal predictions for new query realizations. This approach uses forerunners of a stochastic process to shed light on the distribution of future events, not for the forerunners themselves, but for realizations that subsequently follow in their footsteps.
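    Given a fitted rate function, sampling future events from a non-homogeneous Poisson process is straightforward via standard Lewis-Shedler thinning. The sketch below assumes a hypothetical fitted rate function and shows only this sampling step, not the paper's mixture-fitting or parallel-forecasting machinery.

    ```python
    import numpy as np

    def thin_nhpp(rate_fn, t0, t1, lam_max, rng):
        """Sample event times of a non-homogeneous Poisson process on
        [t0, t1] by thinning: propose candidates from a homogeneous
        Poisson process of rate lam_max >= sup rate_fn, and keep each
        candidate t with probability rate_fn(t) / lam_max.
        """
        events, t = [], t0
        while True:
            t += rng.exponential(1.0 / lam_max)    # next candidate arrival
            if t > t1:
                return np.array(events)
            if rng.uniform() < rate_fn(t) / lam_max:
                events.append(t)

    rng = np.random.default_rng(0)
    # hypothetical fitted rate: a daily cycle on top of a base rate
    rate = lambda t: 2.0 + 1.5 * np.sin(2.0 * np.pi * t)
    future_events = thin_nhpp(rate, t0=0.0, t1=5.0, lam_max=3.5, rng=rng)
    ```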

    Modelling realized variance when returns are serially correlated

    This article examines the impact of serial correlation in high-frequency returns on the realized variance measure. In particular, it is shown that the realized variance measure yields a biased estimate of the conditional return variance when returns are serially correlated. Using 10 years of FTSE-100 minute-by-minute data, we demonstrate that a careful choice of sampling frequency is crucial in avoiding substantial biases. Moreover, we find that the autocovariance structure (magnitude and rate of decay) of FTSE-100 returns at different sampling frequencies is consistent with that of an ARMA process under temporal aggregation. A simple method based on the autocovariance function is proposed for choosing the “optimal” sampling frequency, that is, the highest available frequency at which the serial correlation of returns has a negligible impact on the realized variance measure. We find that the logarithmic realized variance series of the FTSE-100 index, constructed using an optimal sampling frequency of 25 minutes, can be modelled as an ARFIMA process. Exogenous variables such as lagged returns and contemporaneous trading volume appear to be highly significant regressors and are able to explain a large portion of the variation in daily realized variance.

    Keywords: high frequency data, realized return variance, market microstructure, temporal aggregation, long memory, bootstrap
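    Below is a minimal sketch of the two computations this abstract builds on: realized variance as the sum of squared intra-period log returns, and a crude autocorrelation-based choice of sampling frequency. The 0.05 tolerance, the function names, and the toy MA(1) return series are illustrative assumptions, not the paper's exact rule.

    ```python
    import numpy as np

    def realized_variance(prices, step):
        """Realized variance from prices sampled every `step` observations:
        the sum of squared intra-period log returns."""
        r = np.diff(np.log(prices[::step]))
        return np.sum(r ** 2)

    def optimal_step(prices, candidate_steps, tol=0.05):
        """Smallest sampling step (i.e. highest frequency) at which the
        lag-1 autocorrelation of returns is negligible (|rho_1| < tol)."""
        for step in sorted(candidate_steps):
            r = np.diff(np.log(prices[::step]))
            if abs(np.corrcoef(r[:-1], r[1:])[0, 1]) < tol:
                return step
        return max(candidate_steps)

    # toy "minute-by-minute" prices with MA(1)-style serial correlation
    rng = np.random.default_rng(1)
    eps = rng.normal(0.0, 1e-3, size=10_000)
    returns = eps[1:] + 0.4 * eps[:-1]        # serially correlated returns
    prices = 100.0 * np.exp(np.cumsum(returns))
    step = optimal_step(prices, candidate_steps=[1, 5, 10, 25, 30])
    rv = realized_variance(prices, step)
    ```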