Fast Autocorrelated Context Models for Data Compression
A method is presented to automatically generate context models of data by
calculating the data's autocorrelation function. The largest values of the
autocorrelation function occur at the offsets or lags in the bitstream which
tend to be the most highly correlated to any particular location. These offsets
are ideal for use in predictive coding, such as prediction by partial matching (PPM)
or context-mixing algorithms for data compression, making such algorithms more
efficient and more general by reducing or eliminating the need for ad-hoc
models based on particular types of data. Instead of using the definition of
the autocorrelation function, which considers the pairwise correlations of data
requiring O(n^2) time, the Wiener-Khinchin theorem is applied, quickly
obtaining the autocorrelation as the inverse Fast Fourier transform of the
data's power spectrum in O(n log n) time, making the technique practical for
the compression of large data objects. The method is shown to produce the
highest levels of performance obtained to date on a lossless image compression
benchmark.
Comment: v2 includes bibliography
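The two steps the abstract describes, autocorrelation via the Wiener-Khinchin theorem and selection of the strongest lags as context offsets, can be sketched in Python. This is an illustrative numpy sketch; the function names and the choice of k are mine, not the paper's code:

```python
import numpy as np

def autocorrelation_fft(data):
    """Autocorrelation in O(n log n) via the Wiener-Khinchin theorem:
    inverse FFT of the power spectrum of the mean-removed data."""
    x = np.asarray(data, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Zero-pad to 2n so the circular correlation equals the linear one.
    f = np.fft.rfft(x, 2 * n)
    acf = np.fft.irfft(f * np.conj(f))[:n]
    return acf / acf[0]  # normalise so lag 0 equals 1

def best_context_lags(data, k=3):
    """Return the k positive lags with the largest autocorrelation,
    i.e. the offsets most useful as context-model predictors."""
    acf = autocorrelation_fft(data)
    return list(np.argsort(acf[1:])[::-1][:k] + 1)
```

On strongly periodic data (e.g. a signal repeating every 8 samples), the top lag returned is the period, which is exactly the offset a context model would want to condition on.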
The "spurious regression problem" in the classical regression model framework
I analyse the "spurious regression problem" from the Classical Regression Model (CRM) point of view. Simulations show that the autocorrelation corrections suggested by the CRM, e.g., feasible generalised least squares, solve the problem. Estimators are unbiased, consistent, efficient and deliver correctly sized tests. Conversely, first differencing the data results in inefficiencies when the autoregressive parameter in the error process is less than one. I offer practical recommendations for handling cases suspected to be in the "spurious regression" class.
Keywords: spurious regression, classical regression model, generalised least squares, autocorrelation corrections
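As a minimal illustration of one autocorrelation correction of the kind the abstract refers to, here is a numpy sketch of feasible GLS via Cochrane-Orcutt quasi-differencing. The parameter values and sample size are arbitrary choices of mine, not the paper's simulation design:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, rho = 2000, 1.5, 0.8

# Regressor and a stationary AR(1) error process (rho < 1).
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = beta * x + u

# Step 1: OLS in levels (still unbiased here, but inefficient and
# with incorrectly sized naive tests under autocorrelated errors).
b_ols = (x @ y) / (x @ x)

# Step 2: estimate rho from the OLS residuals.
r = y - b_ols * x
rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])

# Step 3: quasi-difference with rho_hat and re-run OLS (feasible GLS).
ys = y[1:] - rho_hat * y[:-1]
xs = x[1:] - rho_hat * x[:-1]
b_fgls = (xs @ ys) / (xs @ xs)
```

Note the contrast with first differencing, which imposes rho = 1; quasi-differencing with the estimated rho_hat < 1 avoids the efficiency loss the abstract mentions.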
On detecting the large separation in the autocorrelation of stellar oscillation time series
The observations carried out by the space missions CoRoT and Kepler provide a
large set of asteroseismic data. Their analysis requires an efficient procedure
first to determine if the star is reliably showing solar-like oscillations,
second to measure the so-called large separation, third to estimate the
asteroseismic information that can be retrieved from the Fourier spectrum. We
develop in this paper a procedure, based on the autocorrelation of the seismic
Fourier spectrum. We have searched for criteria able to predict the output that
one can expect from the analysis by autocorrelation of a seismic time series.
First, the autocorrelation is properly scaled to take into account the
contribution of white noise. Then, we use the null hypothesis H0 test to assess
the reliability of the autocorrelation analysis. Calculations based on solar
and CoRoT time series are performed in order to quantify the performance as a
function of the amplitude of the autocorrelation signal. We propose an
automated determination of the large separation, whose reliability is
quantified by the H0 test. We apply this method to analyze a large set of red
giants observed by CoRoT. We estimate the expected performance for photometric
time series of the Kepler mission. Finally, we demonstrate that the method
makes it possible to distinguish l=0 from l=1 modes. The envelope
autocorrelation function has proven to be very powerful for the determination
of the large separation in noisy asteroseismic data, since it enables us to
quantify the precision of the performance of different measurements: mean large
separation, variation of the large separation with frequency, small separation
and degree identification.
Comment: A&A, in press
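A much-simplified sketch of the core idea: autocorrelating a comb-like power spectrum recovers the large separation as the first strong peak away from zero lag. The synthetic spectrum, peak widths, noise level and search window below are invented for illustration; this is not the paper's envelope autocorrelation (EACF) pipeline:

```python
import numpy as np

dnu_true = 10.0                        # large separation, illustrative units
df = 0.1                               # frequency resolution
freqs = np.arange(0.0, 400.0, df)

# Synthetic power spectrum: Gaussian peaks spaced by dnu_true plus white noise.
spectrum = np.zeros_like(freqs)
for k in range(10, 30):
    spectrum += np.exp(-0.5 * ((freqs - k * dnu_true) / 0.3) ** 2)
spectrum += 0.05 * np.abs(np.random.default_rng(1).normal(size=freqs.size))

# Autocorrelation of the spectrum peaks at lags equal to multiples of dnu.
s = spectrum - spectrum.mean()
acf = np.correlate(s, s, mode="full")[s.size - 1:]
acf /= acf[0]

# First strong peak away from zero lag gives the large separation
# (a 2-unit floor skips the zero-lag peak; illustrative choice).
lo, hi = int(2.0 / df), int(15.0 / df)
dnu_est = (lo + np.argmax(acf[lo:hi])) * df
```

In real data one would additionally correct the scaling for the white-noise contribution and assess the detected peak against the H0 null hypothesis, as the abstract describes.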
Statistical properties of the attendance time series in the minority game
We study the statistical properties of the attendance time series
corresponding to the number of agents making a particular decision in the
minority game (MG). We focus on the analysis of the probability distribution
and the autocorrelation function of the attendance over a time interval in the
efficient phase of the game. In this regime both the probability distribution
and the autocorrelation function are shown to have similar behaviour for time
differences corresponding to multiples of $2 \times 2^m$, which is twice the
number of possible history bit strings in a MG with $N$ agents making decisions
based on the most recent $m$ outcomes of the game.
Comment: 3 pages, 4 Postscript figures
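A minimal minority game, enough to generate an attendance time series whose statistics one can then examine, can be written in a few lines of numpy. The parameter choices are illustrative and this is a generic textbook MG, not the paper's specific setup:

```python
import numpy as np

def play_mg(n_agents=101, m=2, s=2, n_steps=2000, seed=0):
    """Minimal minority game: n_agents (odd) agents, each holding s random
    strategies over the 2^m possible histories of the last m outcomes.
    Returns the attendance time series: the number of agents choosing
    action 1 at each step."""
    rng = np.random.default_rng(seed)
    n_hist = 2 ** m
    # strategies[a, k, h] in {0, 1}: action of agent a's k-th strategy
    # when the recent history is h.
    strategies = rng.integers(0, 2, size=(n_agents, s, n_hist))
    scores = np.zeros((n_agents, s))
    history = 0
    attendance = []
    for _ in range(n_steps):
        best = scores.argmax(axis=1)            # each agent's best strategy
        actions = strategies[np.arange(n_agents), best, history]
        a = int(actions.sum())                  # attendance for action 1
        attendance.append(a)
        minority = int(a < n_agents / 2)        # the winning minority action
        # Reward every strategy that would have picked the minority action.
        scores += (strategies[:, :, history] == minority)
        history = ((history << 1) | minority) % n_hist
    return np.array(attendance)
```

From the returned series one can estimate the attendance distribution and its autocorrelation function; in the efficient phase the abstract's claimed structure appears at time differences that are multiples of twice the number of histories.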
Overrelaxation Algorithm for coupled Gauge-Higgs systems
In this letter we extend the overrelaxation algorithm, known to be very
efficient in gauge theories, to coupled gauge-Higgs systems with a particular
emphasis on the update of the radial mode of the Higgs field. Our numerical
tests of the algorithm show that the autocorrelation times can be reduced
substantially.
Comment: 10 pages, DESY-95-04
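The letter concerns gauge-Higgs systems; as a generic illustration of the overrelaxation idea it extends, here is the standard reflection update for a free scalar field on a 1D lattice. This toy sketch is mine, not the letter's algorithm:

```python
import numpy as np

def overrelax_sweep(phi):
    """One overrelaxation sweep for a free scalar field on a periodic 1D
    lattice with action S = (1/2) * sum_x (phi[x] - phi[x+1])^2.
    Each site is reflected about the minimum of its local action,
    phi_min = (phi[x-1] + phi[x+1]) / 2. For a quadratic action the
    reflection leaves S exactly invariant (microcanonical), so large
    moves cost nothing; it must be mixed with ergodic heat-bath or
    Metropolis steps, and the combination shortens autocorrelations."""
    n = len(phi)
    for x in range(n):
        phi_min = 0.5 * (phi[(x - 1) % n] + phi[(x + 1) % n])
        phi[x] = 2.0 * phi_min - phi[x]
    return phi

def action(phi):
    return 0.5 * np.sum((phi - np.roll(phi, -1)) ** 2)
```

The hard part the letter addresses, updating the radial mode of the Higgs field where the action is not quadratic, has no such exact reflection and needs the generalisation the authors test.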
Attacking the combination generator
We present one of the most efficient attacks against the combination
generator. This attack is inherent to this system as its only assumption is
that the filtering function has a good autocorrelation. This is usually the
case if the system is designed to be resistant to other kinds of attacks. We
use only classical tools, namely vectorial correlation, weight 4 multiples and
the Walsh transform.
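Of the classical tools listed, the Walsh transform is the easiest to illustrate: a fast Walsh-Hadamard transform exposes the correlation of a filtering function with every linear function at once, which is what a correlation attack exploits. The example function (majority of three inputs) is my own choice, not from the paper:

```python
import numpy as np

def walsh_transform(f_table):
    """Fast Walsh-Hadamard transform of a Boolean function given as a
    truth table of length 2^n. Entry a of the result is
    W_f(a) = sum_x (-1)^(f(x) XOR a.x), so W_f(a) / 2^n is the
    correlation of f with the linear function x -> a.x."""
    w = 1 - 2 * np.asarray(f_table, dtype=np.int64)   # map 0/1 to +1/-1
    n = len(w)
    h = 1
    while h < n:                                      # butterfly passes
        for i in range(0, n, 2 * h):
            a = w[i:i + h].copy()
            b = w[i + h:i + 2 * h].copy()
            w[i:i + h] = a + b
            w[i + h:i + 2 * h] = a - b
        h *= 2
    return w

# Majority of 3 inputs, a simple combining-style function
# (truth-table index is x0*4 + x1*2 + x2).
maj = [(x0 & x1) ^ (x0 & x2) ^ (x1 & x2)
       for x0 in (0, 1) for x1 in (0, 1) for x2 in (0, 1)]
```

For MAJ3 the transform shows correlation 1/2 with each single input, exactly the kind of non-zero correlation with the underlying registers that an attack on a combination generator uses.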