
    Estimation from quantized Gaussian measurements: when and how to use dither

    Subtractive dither is a powerful method for removing the signal dependence of quantization noise for coarsely quantized signals. However, estimation from dithered measurements often naively applies the sample mean or midrange, even when the total noise is not well described with a Gaussian or uniform distribution. We show that the generalized Gaussian distribution approximately describes subtractively dithered, quantized samples of a Gaussian signal. Furthermore, a generalized Gaussian fit leads to simple estimators based on order statistics that match the performance of more complicated maximum likelihood estimators requiring iterative solvers. The order statistics-based estimators outperform both the sample mean and midrange for nontrivial sums of Gaussian and uniform noise. Additional analysis of the generalized Gaussian approximation yields rules of thumb for determining when and how to apply dither to quantized measurements. Specifically, we find subtractive dither to be beneficial when the ratio between the Gaussian standard deviation and quantization interval length is roughly less than one-third. When that ratio is also greater than 0.822/K^0.930 for the number of measurements K > 20, estimators we present are more efficient than the midrange.
    https://arxiv.org/abs/1811.06856 (Accepted manuscript)
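    The two rules of thumb in this abstract can be turned into a small decision helper. A minimal sketch (the function names and the midrange implementation are ours; only the thresholds come from the abstract):

    ```python
    def dither_recommended(sigma, delta, K):
        """Rules of thumb from the abstract: subtractive dither helps when
        sigma/delta (Gaussian standard deviation over quantization interval
        length) is roughly below 1/3; the order-statistics estimators are
        more efficient than the midrange when, in addition,
        sigma/delta > 0.822 / K**0.930 for K > 20 measurements."""
        r = sigma / delta
        use_dither = r < 1.0 / 3.0
        beats_midrange = K > 20 and r > 0.822 / K ** 0.930
        return use_dither, beats_midrange

    def midrange(samples):
        """Midrange estimator: average of the extreme order statistics."""
        return 0.5 * (min(samples) + max(samples))
    ```

    For example, with sigma = 0.1, delta = 1.0, and K = 100 measurements, both conditions hold, so dither plus the order-statistics estimators would be the recommendation.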

    Forecasting Stock Index Volatility: The Incremental Information in the Intraday High-Low Price Range

    We compare the incremental information content of implied volatility and intraday high-low range volatility in the context of conditional volatility forecasts for three major market indexes: the S&P 100, the S&P 500, and the Nasdaq 100. Evidence obtained from out-of-sample volatility forecasts indicates that neither implied volatility nor intraday high-low range volatility subsumes entirely the incremental information contained in the other. Our findings suggest that intraday high-low range volatility can usefully augment conditional volatility forecasts for these market indexes.
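    One standard way to turn the intraday high-low range into a volatility estimate is Parkinson's (1980) range-based estimator; the paper's exact construction may differ, but a minimal sketch looks like:

    ```python
    import math

    def parkinson_variance(high, low):
        """Parkinson (1980) range-based daily variance proxy:
        sigma^2 ~= (ln(H/L))^2 / (4 ln 2)."""
        return math.log(high / low) ** 2 / (4.0 * math.log(2.0))

    def mean_range_variance(highs, lows):
        """Average the daily range-based variances over a sample of days."""
        v = [parkinson_variance(h, l) for h, l in zip(highs, lows)]
        return sum(v) / len(v)
    ```

    The range uses two intraday extremes rather than a single close, which is what gives it incremental information over close-to-close return variance.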

    On the distribution of an effective channel estimator for multi-cell massive MIMO

    Accurate channel estimation is of utmost importance for massive MIMO systems to provide significant improvements in spectral and energy efficiency. In this work, we present a study of the distribution of a simple yet effective and practical channel estimator for multi-cell massive MIMO systems suffering from pilot contamination. The proposed channel estimator performs well under moderate to aggressive pilot contamination scenarios without prior knowledge of the inter-cell large-scale channel coefficients and noise power, asymptotically approaching the performance of the linear MMSE estimator as the number of antennas increases. We prove that the distribution of the proposed channel estimator can be accurately approximated by the circularly-symmetric complex normal distribution when the number of antennas, M, deployed at the base station is greater than 10.
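    The pilot-contamination effect mentioned above is easy to see in a toy least-squares estimate (an illustrative single-antenna model with hypothetical names, not the paper's estimator):

    ```python
    import random

    def ls_channel_estimate(h, h_interf, noise_std, p=1.0, rng=random):
        """Toy pilot-based estimate under pilot contamination: the serving
        and interfering cells reuse the same pilot, so the received symbol
        is y = sqrt(p)*(h + h_interf) + n, and the least-squares estimate
        y/sqrt(p) = h + h_interf + n/sqrt(p) stays biased by the interfering
        cell's channel h_interf even as the noise vanishes."""
        n = complex(rng.gauss(0.0, noise_std / 2 ** 0.5),
                    rng.gauss(0.0, noise_std / 2 ** 0.5))
        y = p ** 0.5 * (h + h_interf) + n
        return y / p ** 0.5
    ```

    This residual bias is why estimators that need no prior knowledge of the inter-cell large-scale coefficients, like the one studied in the paper, are of practical interest.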

    Constrained correlation functions from the Millennium Simulation

    Context. In previous work, we developed a quasi-Gaussian approximation for the likelihood of correlation functions, which, in contrast to the usual Gaussian approach, incorporates fundamental mathematical constraints on correlation functions. The analytical computation of these constraints is only feasible in the case of correlation functions of one-dimensional random fields. Aims. In this work, we aim to obtain corresponding constraints in the case of higher-dimensional random fields and test them in a more realistic context. Methods. We develop numerical methods to compute the constraints on correlation functions which are also applicable for two- and three-dimensional fields. In order to test the accuracy of the numerically obtained constraints, we compare them to the analytical results for the one-dimensional case. Finally, we compute correlation functions from the halo catalog of the Millennium Simulation, check whether they obey the constraints, and examine the performance of the transformation used in the construction of the quasi-Gaussian likelihood. Results. We find that our numerical methods of computing the constraints are robust and that the correlation functions measured from the Millennium Simulation obey them. Despite the fact that the measured correlation functions lie well inside the allowed region of parameter space, i.e. far away from the boundaries of the allowed volume defined by the constraints, we find strong indications that the quasi-Gaussian likelihood yields a substantially more accurate description than the Gaussian one.
    Comment: 11 pages, 13 figures, updated to match version accepted by A&
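    A coarse numerical stand-in for such constraints (our simplification, not the paper's construction): sampled correlation values of a unit-variance field on a regular 1D grid must assemble into a positive semidefinite Toeplitz covariance matrix, which already forces e.g. xi_2 >= 2*xi_1^2 - 1.

    ```python
    import numpy as np

    def satisfies_psd_constraint(xi):
        """Necessary constraint on correlation values xi = [xi_1, ..., xi_n]
        of a unit-variance 1D field sampled on a regular grid: the Toeplitz
        covariance matrix C_ij = xi_{|i-j|} (with xi_0 = 1) must be
        positive semidefinite."""
        r = np.concatenate(([1.0], np.asarray(xi, dtype=float)))
        n = len(r)
        C = np.array([[r[abs(i - j)] for j in range(n)] for i in range(n)])
        # allow tiny negative eigenvalues from floating-point round-off
        return bool(np.linalg.eigvalsh(C).min() >= -1e-12)
    ```

    For instance, xi_1 = 0.9 with xi_2 = 0.5 violates the constraint (2*0.9**2 - 1 = 0.62 > 0.5), while xi_2 = 0.7 is allowed.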

    Modeling autoregressive conditional skewness and kurtosis with multi-quantile CAViaR

    Engle and Manganelli (2004) propose CAViaR, a class of models suitable for estimating conditional quantiles in dynamic settings. Engle and Manganelli apply their approach to the estimation of Value at Risk, but this is only one of many possible applications. Here we extend CAViaR models to permit joint modeling of multiple quantiles, Multi-Quantile (MQ) CAViaR. We apply our new methods to estimate measures of conditional skewness and kurtosis defined in terms of conditional quantiles, analogous to the unconditional quantile-based measures of skewness and kurtosis studied by Kim and White (2004). We investigate the performance of our methods by simulation, and we apply MQ-CAViaR to study conditional skewness and kurtosis of S&P 500 daily returns.
    JEL Classification: C13, C32. Keywords: Asset returns, CAViaR, conditional quantiles, dynamic quantiles, kurtosis, skewness.

    Improving the Measurement of Core Inflation in Colombia Using Asymmetric Trimmed Means

    The study evaluates the virtues of asymmetric trimmed means as efficient estimators of core inflation for Colombia, an economy with high and variable inflation rates. Results suggest that the proposed indicators are more efficient than alternative indexes and are particularly suited for environments where price change distributions are non-normal. Computations indicate that an optimally trimmed estimator for the 27-component Colombian CPI during the 1972:06 to 1997:12 period requires that 12 percent be trimmed from the upper tail and 24 percent from the lower tail. This indicator exhibits substantially higher efficiency than the weighted average of price changes (i.e., CPI inflation), the CPI excluding food and energy, the median, and symmetric trimmed means. These findings are robust to changes in the 36-month centered moving average benchmark of annual inflation. The optimal estimator is not found to be robust to changes in data sample, likely reflecting shifts in the parameters of the underlying data distribution caused by structural changes in the Colombian economy, particularly in the post-1990 period. Optimal levels of asymmetrical trimming are also found to be highly sensitive to the degree of disaggregation of the CPI data. This is expected since greater disaggregation reveals higher kurtosis and skewness of the underlying data. The CPI excluding food and energy and the median do not seem to provide persistent efficiency gains in estimation of inflation with respect to the weighted mean. Food and energy prices are critical sectors of middle-income economies such as Colombia and contain valuable information about medium- and long-term trends in inflation. Finally, the median excludes many prices, especially those in the upper tail, that seem to contain valuable information about long-term trends in inflation.
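    The headline trimming can be sketched as follows (an unweighted simplification; the paper trims the weighted distribution of CPI component price changes, and only the trim fractions below come from the abstract's reported optimum):

    ```python
    def asymmetric_trimmed_mean(changes, lower=0.24, upper=0.12):
        """Asymmetric trimmed mean of a list of price changes: sort,
        drop the lowest `lower` fraction and highest `upper` fraction
        of observations, and average what remains.  Defaults follow
        the abstract's optimum: 24% lower tail, 12% upper tail."""
        xs = sorted(changes)
        n = len(xs)
        lo = int(n * lower)        # observations dropped from the lower tail
        hi = n - int(n * upper)    # index of the upper cut
        kept = xs[lo:hi]
        return sum(kept) / len(kept)
    ```

    Trimming more from the lower tail than the upper, as here, shifts the estimator upward relative to a symmetric trim, which is the point when the price-change distribution is skewed.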