    Spectral correlation density estimation via minimum variance distortion-less response filterbanks

    Measuring gravitational waves from binary black hole coalescences: II. the waves' information and its extraction, with and without templates

    We discuss the extraction of information from detected binary black hole (BBH) coalescence gravitational waves, focusing on the merger phase that occurs after the gradual inspiral and before the ringdown. Our results are: (1) If numerical relativity simulations have not produced template merger waveforms before BBH detections by LIGO/VIRGO, one can band-pass filter the merger waves. For BBHs smaller than about 40 solar masses detected via their inspiral waves, the band-pass filtering signal-to-noise ratio indicates that the merger waves should typically be just barely visible in the noise for initial and advanced LIGO interferometers. (2) We derive an optimized (maximum likelihood) method for extracting a best-fit merger waveform from the noisy detector output; one "perpendicularly projects" this output onto a function space (specified using wavelets) that incorporates our prior knowledge of the waveforms. An extension of the method allows one to extract the BBH's two independent waveforms from the outputs of several interferometers. (3) If numerical relativists produce codes for generating merger templates but running the codes is too expensive to allow an extensive survey of the merger parameter space, then a coarse survey of this parameter space, to determine the ranges of the several key parameters and to explore several qualitative issues which we describe, would be useful for data analysis purposes. (4) A complete set of templates could be used to test the nonlinear dynamics of general relativity and to measure some of the binary parameters. We estimate the number of bits of information obtainable from the merger waves (about 10 to 60 for LIGO/VIRGO, up to 200 for LISA), estimate the information loss due to template numerical errors or sparseness in the template grid, and infer approximate requirements on template accuracy and spacing. Comment: 33 pages, RevTeX 3.1 macros, no figures, submitted to Phys. Rev.
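
    As a concrete illustration of point (1), the sketch below band-pass filters a noisy time series containing a toy burst and forms a crude band-limited signal-to-noise figure. This is only a minimal Python sketch, not the authors' analysis pipeline: the waveform, noise level, band edges, and sample rate are illustrative assumptions rather than values from the paper.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 4096.0                                    # sample rate in Hz (assumed)
    t = np.arange(0.0, 1.0, 1.0 / fs)
    # Toy "merger" burst: a short chirp under a Gaussian envelope (illustrative only).
    signal = 0.3 * np.sin(2 * np.pi * (100.0 + 400.0 * t) * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
    noise = np.random.default_rng(0).normal(scale=1.0, size=t.size)
    data = signal + noise

    # Band-pass the detector output around an assumed merger band of 150-600 Hz.
    b, a = butter(4, [150.0, 600.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, data)

    # Crude band-limited SNR: energy of the (known, injected) filtered signal against
    # the variance of the filtered noise. In a real search the signal is unknown;
    # this only shows what restricting to the merger band buys.
    sig_f = filtfilt(b, a, signal)
    noise_f = filtfilt(b, a, noise)
    snr_bp = np.sqrt(np.sum(sig_f ** 2) / np.var(noise_f))
    print("band-pass SNR estimate:", snr_bp)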

    An Amplitude Spectral Capon Estimator with a Variable Filter Length

    Publication in the conference proceedings of EUSIPCO, Bucharest, Romania, 201

    Classical and Bayesian Linear Data Estimators for Unique Word OFDM

    Unique word orthogonal frequency division multiplexing (UW-OFDM) is a novel OFDM signaling concept in which the guard interval is built of a deterministic sequence, the so-called unique word, instead of the conventional random cyclic prefix. In contrast to previous attempts with deterministic sequences in the guard interval, the addressed UW-OFDM signaling approach introduces correlations between the subcarrier symbols, which can be exploited by the receiver in order to improve the bit error ratio performance. In this paper we develop several linear data estimators specifically designed for UW-OFDM, some based on classical and some based on Bayesian estimation theory. Furthermore, we derive complexity-optimized versions of these estimators and study their individual complex multiplication counts in detail. Finally, we evaluate the estimators' performance for the additive white Gaussian noise channel as well as for selected indoor multipath channel scenarios. Comment: Preprint, 13 pages
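
    For a generic linear model y = Hd + n, the two estimator families discussed above reduce to the familiar classical (zero-forcing / least-squares) and Bayesian (LMMSE) solutions. The Python sketch below shows both for a random placeholder matrix H; it is not the UW-OFDM generator matrix or the complexity-optimized forms derived in the paper, and the dimensions and noise variance are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n_sub, n_data = 52, 36                  # assumed subcarrier / data-symbol counts
    # Placeholder complex system matrix; in UW-OFDM this role is played by the
    # combination of generator matrix, DFT and channel, which is not modelled here.
    H = (rng.normal(size=(n_sub, n_data)) + 1j * rng.normal(size=(n_sub, n_data))) / np.sqrt(2)
    d = (rng.choice([-1.0, 1.0], n_data) + 1j * rng.choice([-1.0, 1.0], n_data)) / np.sqrt(2)
    sigma2 = 0.1                            # noise variance (assumed)
    n = np.sqrt(sigma2 / 2) * (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub))
    y = H @ d + n

    # Classical zero-forcing / least-squares estimator: (H^H H)^-1 H^H y
    d_zf = np.linalg.solve(H.conj().T @ H, H.conj().T @ y)

    # Bayesian LMMSE estimator for unit-variance symbols: (H^H H + sigma2 I)^-1 H^H y
    d_lmmse = np.linalg.solve(H.conj().T @ H + sigma2 * np.eye(n_data), H.conj().T @ y)

    print("ZF MSE:   ", np.mean(np.abs(d_zf - d) ** 2))
    print("LMMSE MSE:", np.mean(np.abs(d_lmmse - d) ** 2))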

    The Theory and Practice of Estimating the Accuracy of Dynamic Flight-Determined Coefficients

    Means of assessing the accuracy of maximum likelihood parameter estimates obtained from dynamic flight data are discussed. The most commonly used analytical predictors of accuracy are derived and compared from both statistical and simplified geometric standpoints. The accuracy predictions are evaluated with real and simulated data, with an emphasis on practical considerations such as modeling error. Improved computations of the Cramer-Rao bound to correct large discrepancies due to colored noise and modeling error are presented. The corrected Cramer-Rao bound is shown to be the best available analytical predictor of accuracy, and several practical examples of the use of the Cramer-Rao bound are given. Engineering judgement, aided by such analytical tools, is the final arbiter of accuracy estimation.
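
    For a concrete picture of the Cramer-Rao bound as an accuracy predictor, the sketch below builds the Fisher information from the model output sensitivities for measurements corrupted by white Gaussian noise and inverts it. The first-order decay model, noise level, and parameter values are illustrative assumptions, not an aircraft model from the report, and no colored-noise or modeling-error correction is included.

    import numpy as np

    t = np.linspace(0.0, 5.0, 200)              # assumed time grid of the manoeuvre
    a_true, b_true = 1.5, 0.8                   # assumed "true" model parameters
    sigma = 0.05                                # assumed measurement noise std (white)

    # Toy response model m(t; a, b) = a * exp(-b t) standing in for the dynamic model.
    dm_da = np.exp(-b_true * t)                 # sensitivity of the output to a
    dm_db = -a_true * t * np.exp(-b_true * t)   # sensitivity of the output to b
    S = np.column_stack([dm_da, dm_db])

    fisher = S.T @ S / sigma ** 2               # Fisher information matrix
    crb = np.linalg.inv(fisher)                 # Cramer-Rao lower bound on cov(theta_hat)
    print("predicted parameter standard deviations:", np.sqrt(np.diag(crb)))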

    Gravitational waves from coalescing binaries: detection strategies and Monte Carlo estimation of parameters

    The paper deals with issues pertaining to the detection of gravitational waves from coalescing binaries. We introduce the application of differential geometry to the problem of optimal detection of the `chirp signal'. We have also carried out extensive Monte Carlo simulations to understand the errors in the estimation of parameters of the binary system. We find that the errors are much larger than those predicted by the covariance matrix, even at a high SNR of 10-15. We also introduce the idea of using the instant of coalescence rather than the time of arrival to determine the direction to the source. Comment: 28 pages, REVTEX, 12 figures (bundled via uufiles command along with this paper), submitted to Phys. Rev.
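
    The covariance-matrix versus Monte Carlo comparison can be illustrated on a much simpler toy problem: estimate one parameter of a known waveform by maximum likelihood over many noise realisations and compare the empirical scatter with the covariance-matrix (Cramer-Rao) prediction. In the Python sketch below a single sinusoid frequency stands in for the chirp parameters; all numbers are illustrative assumptions, and the toy is not expected to reproduce the paper's finding that the actual errors exceed the prediction.

    import numpy as np

    rng = np.random.default_rng(2)
    fs, T = 256.0, 4.0
    t = np.arange(0.0, T, 1.0 / fs)
    f_true, A, sigma = 30.0, 1.0, 2.0            # assumed signal and noise parameters
    f_grid = np.linspace(29.0, 31.0, 2001)       # search grid for the frequency
    templates = A * np.sin(2.0 * np.pi * f_grid[:, None] * t)

    estimates = []
    for _ in range(300):
        data = A * np.sin(2.0 * np.pi * f_true * t) + rng.normal(scale=sigma, size=t.size)
        # Maximum likelihood under white Gaussian noise: minimise the residual sum of squares.
        rss = np.sum((data - templates) ** 2, axis=1)
        estimates.append(f_grid[np.argmin(rss)])

    # Covariance-matrix (Cramer-Rao) prediction for the frequency error.
    dm_df = A * 2.0 * np.pi * t * np.cos(2.0 * np.pi * f_true * t)
    predicted_std = sigma / np.sqrt(np.sum(dm_df ** 2))
    print("Monte Carlo std:", np.std(estimates), " predicted std:", predicted_std)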

    New Eurocoin: Tracking Economic Growth in Real Time

    This paper presents ideas and methods underlying the construction of an indicator that tracks euro area GDP growth, but, unlike GDP growth, (i) is updated monthly and almost in real time; (ii) is free from short-run dynamics. Removal of short-run dynamics from a time series, to isolate the medium- to long-run component, can be obtained by a band-pass filter. However, it is well known that band-pass filters, being two-sided, perform very poorly at the end of the sample. New Eurocoin is an estimator of the medium- to long-run component of the GDP that only uses contemporaneous values of a large panel of macroeconomic time series, so that no end-of-sample deterioration occurs. Moreover, as our dataset is monthly, New Eurocoin can be updated each month and with a very short delay. Our method is based on generalized principal components that are designed to use leading variables in the dataset as proxies for future values of GDP growth. As the medium- to long-run component of the GDP is observable, although with delay, the performance of New Eurocoin at the end of the sample can be measured. Keywords: coincident indicator, band-pass filter, large-dataset factor models, generalized principal components
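
    The sketch below illustrates the basic building block mentioned above: summarising a large monthly panel with a factor that is available through the latest observation, in contrast with a two-sided band-pass filter that degrades at the end of the sample. It uses ordinary static principal components on synthetic data; it is not the generalized principal components construction of New Eurocoin, and the panel, common factor, and loadings are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(3)
    n_months, n_series = 240, 50                 # assumed panel dimensions
    common = np.cumsum(rng.normal(size=n_months))             # synthetic common growth factor
    loadings = rng.uniform(0.5, 1.5, n_series)
    panel = np.outer(common, loadings) + rng.normal(scale=2.0, size=(n_months, n_series))

    # Standardise each series, then take the leading principal component of the panel.
    z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    indicator = z @ vt[0]                        # one value per month, up to the latest one

    # A principal component's sign is arbitrary; align it with the panel average.
    if np.corrcoef(indicator, z.mean(axis=1))[0, 1] < 0:
        indicator = -indicator
    print("latest readings of the indicator:", indicator[-3:])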