64 research outputs found

    Economic Development Region: Revelstoke

    Rural Resilience; Economic Development

    Characterisation and Estimation of Entropy Rate for Long Range Dependent Processes

    Much of the theory of random processes has been developed under the assumption that distant time periods are weakly correlated. However, in many real-world phenomena this assumption does not hold, and these findings have generated extensive research interest in stochastic processes whose strong correlations persist over long time periods. This behaviour is called long range dependence. It is defined in the time domain by the slow decay of the autocorrelation function and in the frequency domain by a pole at the origin of the spectral density function. Information theory has proved very useful in statistics and probability theory, yet there has been little research into the information theoretic properties and characterisations of long range dependence.

    This thesis characterises long range dependence, for discrete and continuous-valued stochastic processes in discrete time, by an information theoretic measure, the entropy rate, which measures the average amount of information contained in a stochastic process per random variable. Common characterisations of long range dependence in the time and frequency domains involve slow convergence to quantities of interest, such as the sample mean. We show that this type of behaviour is also present in the entropy rate: under long range dependence the conditional entropy converges slowly to the entropy rate, because some entropic quantities diverge to infinity. As an extension, we show for classes of Gaussian processes and Markov chains that long range dependence is characterised by an infinite amount of shared information between the past and future of the process.

    This slow convergence makes accurate estimation of the differential entropy rate from long range dependent data difficult, to the extent that existing techniques are either inaccurate or computationally intensive. We introduce a new estimation technique that balances these two concerns and produces quick and accurate estimates of the differential entropy rate from continuous-valued data. The technique is based on a connection between the differential entropy rate of a process and the Shannon entropy rate of its quantised version. This allows us to draw on the extensive research into Shannon entropy rate estimation for discrete-valued data, and we show that properties of the differential entropy rate estimator are inherited from the choice of Shannon entropy rate estimator.

    Thesis (Ph.D.) -- University of Adelaide, School of Mathematical Sciences, 202
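    The quantisation connection described in the abstract can be illustrated with a minimal sketch. This is not the estimator developed in the thesis; it only demonstrates the underlying relation that, for a bin width Δ, the differential entropy rate is approximately the Shannon entropy rate of the quantised process plus log Δ. The process (a short range dependent Gaussian AR(1), chosen here only because its true differential entropy rate, 0.5·log(2πe·σ²) with σ the innovation standard deviation, is known in closed form), the bin width, and the plug-in Shannon entropy rate estimator are all illustrative assumptions.

    import numpy as np

    def block_entropy(symbols, k):
        """Plug-in estimate of the Shannon block entropy H(X_1, ..., X_k) in nats."""
        blocks = np.lib.stride_tricks.sliding_window_view(symbols, k)
        _, counts = np.unique(blocks, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def differential_entropy_rate_estimate(x, bin_width, order=2):
        """Quantise x with the given bin width, estimate the Shannon entropy rate
        via the conditional block entropy H(X_k | X_1..X_{k-1}), then add log(bin_width)."""
        q = np.floor(x / bin_width).astype(np.int64)
        shannon_rate = block_entropy(q, order) - block_entropy(q, order - 1)
        return shannon_rate + np.log(bin_width)

    # Illustrative data: a Gaussian AR(1) process (Markov, short range dependent),
    # whose true differential entropy rate is 0.5 * log(2 * pi * e * sigma^2).
    rng = np.random.default_rng(0)
    phi, sigma, n = 0.6, 1.0, 200_000
    x = np.empty(n)
    x[0] = 0.0
    for t in range(1, n):
        x[t] = phi * x[t - 1] + sigma * rng.standard_normal()

    true_rate = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)
    estimate = differential_entropy_rate_estimate(x, bin_width=0.5, order=2)
    print(f"true rate  {true_rate:.3f} nats/sample")
    print(f"estimate   {estimate:.3f} nats/sample")

    The quantisation relation is only approximate for a finite bin width, and the plug-in conditional entropy is a crude stand-in for the more careful Shannon entropy rate estimators the abstract alludes to; for long range dependent data, those estimation issues are exactly what the thesis addresses.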

    The Murray Ledger, May 2, 1918


    Wastewater Triad Project: Final Summary Report


    The Chronicle [January 31, 1986]


    The Daily Egyptian, April 04, 1994


    Princeton Banner, February 27, 1879


    Differential Mortality in Mississippi.


    Mount Vernon Democratic Banner June 1, 1867

    Mount Vernon Democratic Banner was a newspaper published weekly in Mount Vernon, Ohio. Until 1853, it was published as the Democratic Banner.