Statistical analysis of low level atmospheric turbulence
The statistical properties of low-level wind-turbulence data were obtained with the model 1080 total vector anemometer and the model 1296 dual split-film anemometer, both manufactured by Thermo Systems Incorporated. The data obtained from these fast-response probes were compared with the results obtained from a pair of Gill propeller anemometers. The digitized time series representing the three velocity components and the temperature were each divided into a number of blocks, the length of which depended on the lowest frequency of interest and also on the storage capacity of the available computer. A moving-average and differencing high-pass filter was used to remove the trend and the low-frequency components in the time series. The calculated results for each of the anemometers used are presented in graphical or tabular form.
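The moving-average and differencing high-pass filter described above can be sketched in a few lines. This is a generic illustration, not the study's exact filter; the window length and the synthetic series are assumptions.

```python
import numpy as np

def moving_average_detrend(x, window):
    """Subtract a centered moving average from x to remove the slow trend.

    Sketch of moving-average detrending; the window length used in the
    original study is not reported here, so 51 below is an assumption.
    """
    kernel = np.ones(window) / window
    trend = np.convolve(x, kernel, mode="same")
    return x - trend

def difference_filter(x):
    """First-difference high-pass filter: y[n] = x[n] - x[n-1]."""
    return np.diff(x)

# Example: remove a slow linear trend from a noisy series.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)
series = 0.5 * t + rng.normal(0, 0.1, t.size)   # trend + noise
detrended = difference_filter(moving_average_detrend(series, 51))
```

The moving average removes the trend while the differencing step further suppresses any remaining low-frequency content, at the cost of amplifying high-frequency noise.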
Kernel Density Estimation with Linked Boundary Conditions
Kernel density estimation on a finite interval poses an outstanding challenge
because of the well-recognized bias at the boundaries of the interval.
Motivated by an application in cancer research, we consider a boundary
constraint linking the values of the unknown target density function at the
boundaries. We provide a kernel density estimator (KDE) that successfully
incorporates this linked boundary condition, leading to a non-self-adjoint
diffusion process and expansions in non-separable generalized eigenfunctions.
The solution is rigorously analyzed through an integral representation given by
the unified transform (or Fokas method). The new KDE possesses many desirable
properties, such as consistency, asymptotically negligible bias at the
boundaries, and an increased rate of approximation, as measured by the AMISE.
We apply our method to the motivating example in biology and provide numerical
experiments with synthetic data, including comparisons with state-of-the-art
KDEs (which currently cannot handle linked boundary constraints). Results
suggest that the new method is fast and accurate. Furthermore, we demonstrate
how to build statistical estimators of the boundary conditions satisfied by the
target function without a priori knowledge. Our analysis can also be extended to
more general boundary conditions that may be encountered in applications.
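The boundary bias that motivates this work, and one common way to mitigate it, can be illustrated with a standard reflection-corrected Gaussian KDE. Note this is not the linked-boundary estimator of the abstract; it is a simpler boundary correction shown only to make the problem concrete. The bandwidth and the uniform test sample are assumptions.

```python
import numpy as np

def reflected_kde(data, grid, bandwidth, a=0.0, b=1.0):
    """Gaussian KDE on [a, b] with boundary correction by reflection.

    A plain KDE loses roughly half its mass at each endpoint; reflecting
    the sample about both boundaries restores it. This is a standard
    sketch, not the linked-boundary estimator discussed above.
    """
    # Reflect the sample about both endpoints, then evaluate on [a, b].
    augmented = np.concatenate([data, 2 * a - data, 2 * b - data])
    diffs = (grid[:, None] - augmented[None, :]) / bandwidth
    kernel = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kernel.sum(axis=1) / (len(data) * bandwidth)

rng = np.random.default_rng(1)
sample = rng.uniform(0, 1, 500)      # true density is 1 on [0, 1]
xs = np.linspace(0, 1, 101)
density = reflected_kde(sample, xs, bandwidth=0.1)
```

Without the reflection, the estimate at the endpoints would be biased toward half the true value; reflection fixes that pointwise, but unlike the linked-boundary approach it cannot enforce a constraint coupling the two endpoint values.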
Bayesian interpretation of periodograms
The usual nonparametric approach to spectral analysis is revisited within the
regularization framework. Both usual and windowed periodograms are obtained as
the squared modulus of the minimizer of regularized least squares criteria.
Then, particular attention is paid to their interpretation within the Bayesian
statistical framework. Finally, the question of unsupervised hyperparameter and
window selection is addressed. It is shown that the maximum-likelihood solution is
both formally achievable and practically useful.
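The periodogram as the squared modulus of a Fourier transform, in both raw and windowed form, can be sketched directly. The Hann taper and the test sinusoid are illustrative choices, not ones prescribed by the abstract.

```python
import numpy as np

def periodogram(x):
    """Raw periodogram: squared modulus of the DFT, scaled by 1/N."""
    n = len(x)
    return np.abs(np.fft.rfft(x))**2 / n

def windowed_periodogram(x, window):
    """Windowed (tapered) periodogram; taper choice is an assumption."""
    w = window(len(x))
    return np.abs(np.fft.rfft(x * w))**2 / np.sum(w**2)

# A sinusoid at 50 Hz sampled at 1 kHz should peak at bin 50
# (one second of data gives 1 Hz frequency resolution).
fs = 1000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 50 * t)
p = periodogram(x)
pw = windowed_periodogram(x, np.hanning)
peak_bin = int(np.argmax(p))
```

The windowed version trades some frequency resolution (a wider main lobe) for reduced spectral leakage, which is exactly the trade-off that the regularization view above formalizes.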
Stochastic partial differential equation based modelling of large space-time data sets
Increasingly larger data sets of processes in space and time ask for
statistical models and methods that can cope with such data. We show that the
solution of a stochastic advection-diffusion partial differential equation
provides a flexible model class for spatio-temporal processes which is
computationally feasible also for large data sets. The Gaussian process defined
through the stochastic partial differential equation has in general a
nonseparable covariance structure. Furthermore, its parameters can be
physically interpreted as explicitly modeling phenomena such as transport and
diffusion that occur in many natural processes in diverse fields ranging from
environmental sciences to ecology. In order to obtain computationally efficient
statistical algorithms we use spectral methods to solve the stochastic partial
differential equation. This has the advantage that approximation errors do not
accumulate over time, and that in the spectral space the computational cost
grows linearly with the dimension, the total computational costs of Bayesian or
frequentist inference being dominated by the fast Fourier transform. The
proposed model is applied to postprocessing of precipitation forecasts from a
numerical weather prediction model for northern Switzerland. In contrast to the
raw forecasts from the numerical model, the postprocessed forecasts are
calibrated and quantify prediction uncertainty. Moreover, they outperform the
raw forecasts, in the sense that they have a lower mean absolute error.
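The spectral time-stepping idea described above, where each Fourier mode of the advection-diffusion equation evolves independently with an exact exponential factor so that errors do not accumulate over time, can be sketched in one dimension. The transport speed, diffusivity, and initial condition are illustrative assumptions.

```python
import numpy as np

def advection_diffusion_step(u, dt, dx, c, nu):
    """Advance u_t + c*u_x = nu*u_xx by one step in Fourier space.

    Each mode u_hat(k) is multiplied by exp((-i*c*k - nu*k^2) * dt),
    which is the exact solution operator on a periodic domain, so the
    step introduces no time-discretization error.
    """
    n = len(u)
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)   # wavenumbers
    u_hat = np.fft.fft(u)
    u_hat *= np.exp((-1j * c * k - nu * k**2) * dt)
    return np.real(np.fft.ifft(u_hat))

# Transport and smooth a Gaussian bump on a periodic domain.
n, L = 256, 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
u0 = np.exp(-10 * (x - np.pi)**2)
u1 = advection_diffusion_step(u0, dt=0.1, dx=L / n, c=1.0, nu=0.01)
```

The cost per step is dominated by the FFT pair, which is the linear-in-dimension behavior the abstract exploits for large data sets.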
Investigating Economic Trends And Cycles
Methods are described for extracting the trend from an economic data sequence and for isolating the cycles that surround it. The latter often consist of a business cycle of variable duration and a perennial seasonal cycle. There is no evident point in the frequency spectrum where the trend ends and the business cycle begins. Therefore, unless it can be represented by a simple analytic function, such as an exponential growth path, there is bound to be a degree of arbitrariness in the definition of the trend. The business cycle, however defined, is liable to have an upper limit to its frequency range that falls short of the Nyquist frequency, which is the maximum observable frequency in sampled data. This must be taken into account in fitting an ARMA model to the detrended data.
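One concrete way to impose the necessarily arbitrary trend/cycle split discussed above is a frequency-domain low-pass filter: keep only the Fourier components below a chosen fraction of the Nyquist frequency and call that the trend. The cutoff fraction and the synthetic series below are assumptions, not the paper's choices.

```python
import numpy as np

def fourier_trend(y, cutoff_frac):
    """Extract a trend by keeping only the lowest Fourier frequencies.

    cutoff_frac is the fraction of the retained spectrum assigned to
    the trend; as noted above, any such threshold is arbitrary since
    the spectrum has no evident point where the trend ends.
    """
    n = len(y)
    y_hat = np.fft.rfft(y)
    cutoff = int(cutoff_frac * len(y_hat))
    y_hat[cutoff:] = 0          # zero everything above the cutoff
    return np.fft.irfft(y_hat, n)

rng = np.random.default_rng(2)
n = 400
t = np.arange(n)
y = np.sin(2 * np.pi * t / 200) + 0.3 * rng.standard_normal(n)
trend = fourier_trend(y, cutoff_frac=0.05)
cycle = y - trend               # detrended series, e.g. for ARMA fitting
```

Because the detrended series has little power near the Nyquist frequency, an ARMA model fitted to it must accommodate a spectrum that effectively cuts off below the maximum observable frequency, which is the point the abstract makes.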
A kepstrum approach to filtering, smoothing and prediction
The kepstrum (or complex cepstrum) method is revisited and applied to the problem of spectral factorization
where the spectrum is directly estimated from observations. The solution to this problem in turn leads to a new
approach to optimal filtering, smoothing and prediction using the Wiener theory. Unlike previous approaches to
adaptive and self-tuning filtering, the technique, when implemented, does not require a priori information on the
type or order of the signal generating model. And unlike other approaches - with the exception of spectral
subtraction - no state-space or polynomial model is necessary. In this first paper, results are restricted to
stationary signals and additive white noise.
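The kepstrum-based spectral factorization at the heart of this approach can be sketched as follows: take the log of the estimated power spectrum, transform to the cepstral domain, keep only the causal part (splitting the zeroth coefficient), and exponentiate back to obtain the minimum-phase factor. Grid length and the test spectrum are assumptions.

```python
import numpy as np

def minimum_phase_factor(power_spectrum):
    """Spectral factorization via the kepstrum (complex cepstrum).

    Given samples of a power spectrum S(w) on a uniform frequency grid,
    return the minimum-phase factor H(w) with |H(w)|^2 = S(w). The
    negative-time cepstral coefficients are discarded, which is what
    forces the factor to be minimum phase.
    """
    log_s = np.log(power_spectrum)
    cepstrum = np.fft.ifft(log_s).real     # kepstrum coefficients
    n = len(cepstrum)
    fold = np.zeros(n)
    fold[0] = cepstrum[0] / 2              # split c[0] between factors
    fold[1:n // 2] = cepstrum[1:n // 2]    # keep the causal part only
    return np.exp(np.fft.fft(fold))

# Check against a known factorization: S(w) = |1 + 0.5 e^{-iw}|^2.
w = np.linspace(0, 2 * np.pi, 256, endpoint=False)
s = 1.25 + np.cos(w)                       # spectrum of 1 + 0.5 z^{-1}
h = minimum_phase_factor(s)
```

Note that nothing here requires knowing the order or type of the signal model: the factor is computed directly from spectral samples, which is the property the abstract emphasizes.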
Variational Data Assimilation via Sparse Regularization
This paper studies the role of sparse regularization in a properly chosen
basis for variational data assimilation (VDA) problems. Specifically, it
focuses on data assimilation of noisy and down-sampled observations while the
state variable of interest exhibits sparsity in the real or transformed domain.
We show that in the presence of sparsity, the ℓ1-norm regularization
produces more accurate and stable solutions than the classic data assimilation
methods. To motivate further developments of the proposed methodology,
assimilation experiments are conducted in the wavelet and spectral domain using
the linear advection-diffusion equation.
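The role of ℓ1 regularization in recovering a sparse state from noisy, down-sampled observations can be illustrated with a generic iterative shrinkage-thresholding (ISTA) solver. This is a sketch of sparse recovery in general, not the paper's VDA formulation; the observation operator H, the penalty weight, and the step size are all assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(H, y, lam, step, n_iter=500):
    """ISTA for min_x 0.5*||H x - y||^2 + lam*||x||_1.

    Alternates a gradient step on the quadratic misfit with
    soft thresholding, which drives small coefficients to zero.
    step must be below 1 / ||H||^2 for convergence.
    """
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = H.T @ (H @ x - y)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Recover a 3-sparse state from 40 noisy random observations of 100 unknowns.
rng = np.random.default_rng(3)
n, m = 100, 40
x_true = np.zeros(n)
x_true[[10, 50, 80]] = [1.0, -2.0, 1.5]
H = rng.standard_normal((m, n)) / np.sqrt(m)
y = H @ x_true + 0.01 * rng.standard_normal(m)
x_hat = ista(H, y, lam=0.02, step=0.1, n_iter=1000)
```

With far fewer observations than unknowns, the least-squares problem alone is underdetermined; the ℓ1 penalty is what makes the solution both accurate and stable, mirroring the comparison with classic assimilation methods in the abstract.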