4,328 research outputs found

    Nonparametric recursive aggregation process

    In this work we introduce a nonparametric recursive aggregation process called Multilayer Aggregation (MLA). The name refers to the fact that at each step the results of the previous one are aggregated, so that, before the final result is derived, the initial values pass through several layers of aggregation. Most conventional aggregation operators, such as the weighted mean, combine numerical values according to a vector of weights (parameters). The MLA operators instead apply a vector of aggregation operators recursively over the input values. Consequently, a kind of unsupervised, self-tuning aggregation process is induced, combining the individual values in a fashion determined by the choice of aggregation operators.
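
    As a hedged illustration of such a scheme, the sketch below applies a fixed vector of aggregation operators (harmonic, geometric and arithmetic means, chosen purely as an example and not taken from the paper) to the current vector of values, layer after layer, until the values collapse to a common limit.

```python
# Minimal sketch of a multilayer-aggregation-style process.
# The operator vector and the convergence test are illustrative assumptions,
# not the construction from the paper.
import statistics


def multilayer_aggregate(values, operators, tol=1e-9, max_layers=1000):
    """Repeatedly apply a vector of aggregation operators until the
    aggregated values (numerically) collapse to a single number."""
    current = list(values)
    for _ in range(max_layers):
        # One layer: every operator aggregates the whole current vector.
        current = [op(current) for op in operators]
        if max(current) - min(current) < tol:
            return current[0]
    return statistics.fmean(current)  # fallback if no convergence


if __name__ == "__main__":
    # Example operator vector; the means below require positive inputs.
    ops = [statistics.harmonic_mean, statistics.geometric_mean, statistics.fmean]
    print(multilayer_aggregate([0.2, 0.5, 0.9, 0.4], ops))
```

    Because the harmonic mean exceeds the minimum and the arithmetic mean stays below the maximum, each layer shrinks the spread of the vector, so the recursion settles on a single compound value.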

    Aggregation of predictors for nonstationary sub-linear processes and online adaptive forecasting of time varying autoregressive processes

    In this work, we study the problem of aggregating a finite number of predictors for nonstationary sub-linear processes. We provide oracle inequalities relying essentially on three ingredients: (1) a uniform bound on the $\ell^1$ norm of the time-varying sub-linear coefficients, (2) a Lipschitz assumption on the predictors and (3) moment conditions on the noise appearing in the linear representation. Two kinds of aggregation are considered, giving rise to different moment conditions on the noise and more or less sharp oracle inequalities. We apply this approach to derive an adaptive predictor for locally stationary time-varying autoregressive (TVAR) processes. It is obtained by aggregating a finite number of well chosen predictors, each of them enjoying an optimal minimax convergence rate under specific smoothness conditions on the TVAR coefficients. We show that the resulting aggregated predictor achieves a minimax rate while adapting to the unknown smoothness. To prove this result, a lower bound is established for the minimax rate of the prediction risk for the TVAR process. Numerical experiments complete this study. An important feature of this approach is that the aggregated predictor can be computed recursively and is thus applicable in an online prediction context. Published at http://dx.doi.org/10.1214/15-AOS1345 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
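
    Since the aggregated predictor is computed recursively, its flavour can be conveyed by a standard exponentially weighted aggregation rule; the sketch below is a generic version of that idea, in which the learning rate eta, the squared-error loss and the two toy predictors are assumptions for illustration rather than the paper's exact construction.

```python
# Hedged sketch: online exponentially weighted aggregation of forecasters.
# The loss, the learning rate and the toy predictors are illustrative assumptions.
import numpy as np


def online_aggregate(y, predictors, eta=1.0):
    """Aggregate one-step-ahead forecasts of several predictors.

    y          : observed series, shape (T,)
    predictors : list of callables, each mapping the past y[:t] to a forecast of y[t]
    eta        : learning-rate parameter of the exponential weights
    """
    K = len(predictors)
    weights = np.full(K, 1.0 / K)
    forecasts = np.zeros(len(y))
    for t in range(1, len(y)):
        preds = np.array([p(y[:t]) for p in predictors])
        forecasts[t] = weights @ preds          # aggregated forecast
        losses = (preds - y[t]) ** 2            # squared prediction errors
        weights *= np.exp(-eta * losses)        # recursive weight update
        weights /= weights.sum()
    return forecasts


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = 0.1 * np.cumsum(rng.normal(size=200))
    # Two toy predictors: last value, and mean of the last five values.
    preds = [lambda past: past[-1],
             lambda past: past[-5:].mean()]
    print(np.mean((online_aggregate(y, preds)[1:] - y[1:]) ** 2))
```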

    Learning by mirror averaging

    Given a finite collection of estimators or classifiers, we study the problem of model selection type aggregation, that is, we construct a new estimator or classifier, called the aggregate, which is nearly as good as the best among them with respect to a given risk criterion. We define our aggregate by a simple recursive procedure which solves an auxiliary stochastic linear programming problem related to the original nonlinear one and constitutes a special case of the mirror averaging algorithm. We show that the aggregate satisfies sharp oracle inequalities under some general assumptions. The results are applied to several problems including regression, classification and density estimation. Published at http://dx.doi.org/10.1214/07-AOS546 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
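
    A hedged sketch of the flavour of such a recursion is given below: exponential weights are formed from cumulative empirical losses and the successive weight vectors are averaged along the sample. This Cesàro-averaged exponential weighting is a generic illustration, not the paper's exact algorithm; the squared loss, the temperature beta and the toy candidate estimators are assumptions.

```python
# Hedged sketch of a mirror-averaging-style aggregate: exponential weights
# built from cumulative losses, averaged along the data sequence.
# The squared loss and the temperature beta are illustrative assumptions.
import numpy as np


def mirror_average_weights(losses, beta=1.0):
    """losses: array of shape (n, K); losses[i, k] is the loss of estimator k
    on observation i. Returns averaged aggregation weights of shape (K,)."""
    n, K = losses.shape
    cumulative = np.zeros(K)
    avg_weights = np.zeros(K)
    for i in range(n):
        w = np.exp(-beta * cumulative)
        w /= w.sum()
        avg_weights += w / n          # average the successive weight vectors
        cumulative += losses[i]       # update cumulative empirical losses
    return avg_weights


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy setup: three candidate estimators of a scalar mean, 200 observations.
    theta_hat = np.array([0.0, 0.5, 1.0])
    x = rng.normal(loc=0.4, scale=1.0, size=200)
    losses = (x[:, None] - theta_hat[None, :]) ** 2
    w = mirror_average_weights(losses)
    print("weights:", w, "aggregate:", w @ theta_hat)
```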

    Why Has U.S. Inflation Become Harder to Forecast?

    Forecasts of the rate of price inflation play a central role in the formulation of monetary policy, and forecasting inflation is a key job for economists at the Federal Reserve Board. This paper examines whether this job has become harder and, to the extent that it has, what changes in the inflation process have made it so. The main finding is that the univariate inflation process is well described by an unobserved component trend-cycle model with stochastic volatility or, equivalently, an integrated moving average process with time-varying parameters; this model explains a variety of recent univariate inflation forecasting puzzles. It appears currently to be difficult for multivariate forecasts to improve on forecasts made using this time-varying univariate model.
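
    For concreteness, a minimal simulation sketch of an unobserved components trend-cycle model with stochastic volatility of the kind described here is shown below: inflation equals a random-walk trend plus a transitory shock, and the log variances of both shocks follow their own random walks. The parameter gamma and all numerical values are assumptions for illustration only.

```python
# Hedged sketch: simulate an unobserved-components trend-cycle model with
# stochastic volatility (UC-SV).  pi_t = tau_t + eps_t, tau_t a random walk,
# and the log variances of both shocks follow random walks.
# All parameter values below are illustrative assumptions.
import numpy as np


def simulate_ucsv(T=400, gamma=0.2, seed=0):
    rng = np.random.default_rng(seed)
    log_var_trend = np.zeros(T)   # log variance of the trend shock
    log_var_cycle = np.zeros(T)   # log variance of the transitory shock
    tau = np.zeros(T)             # trend (permanent) component
    pi = np.zeros(T)              # simulated "inflation" series
    for t in range(1, T):
        # Stochastic volatilities: random walks in the log variances.
        log_var_trend[t] = log_var_trend[t - 1] + gamma * rng.normal()
        log_var_cycle[t] = log_var_cycle[t - 1] + gamma * rng.normal()
        # Permanent (trend) and transitory (cycle) components.
        tau[t] = tau[t - 1] + np.exp(0.5 * log_var_trend[t]) * rng.normal()
        pi[t] = tau[t] + np.exp(0.5 * log_var_cycle[t]) * rng.normal()
    return pi, tau


if __name__ == "__main__":
    pi, tau = simulate_ucsv()
    print(pi[-5:])
```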

    Does money matter in inflation forecasting?

    This paper provides the most comprehensive evidence to date on whether monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.
    Forecasting; Inflation (Finance); Monetary theory
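
    As a hedged illustration of the kernel side of such a comparison, the sketch below fits a batch kernel ridge regression on lagged values and compares its out-of-sample error with a naive random-walk forecast. Batch kernel ridge regression is used here as a simple stand-in for the kernel recursive least squares predictor; the lag length, kernel and regularisation settings are assumptions.

```python
# Hedged sketch: kernel regression on lags vs. a naive random-walk forecast.
# Batch kernel ridge regression stands in for kernel recursive least squares;
# lag length, kernel and regularisation are illustrative assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge


def make_lags(y, p):
    """Build a lag matrix X (row t holds y[t-1], ..., y[t-p]) and targets y[t]."""
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    return X, y[p:]


def forecast_errors(y, p=4, split=0.8):
    X, target = make_lags(np.asarray(y, dtype=float), p)
    n_train = int(split * len(target))
    model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5)
    model.fit(X[:n_train], target[:n_train])
    kernel_pred = model.predict(X[n_train:])
    naive_pred = X[n_train:, 0]          # random walk: forecast = last observed value
    actual = target[n_train:]
    return (np.mean((kernel_pred - actual) ** 2),
            np.mean((naive_pred - actual) ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Toy nonlinear AR(1) series standing in for an inflation rate.
    y = [0.0]
    for _ in range(400):
        y.append(0.8 * np.tanh(y[-1]) + 0.1 * rng.normal())
    print("kernel MSE, random-walk MSE:", forecast_errors(np.array(y)))
```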

    Adapting to Unknown Smoothness by Aggregation of Thresholded Wavelet Estimators

    We study the performance of an adaptive procedure based on a convex combination, with data-driven weights, of term-by-term thresholded wavelet estimators. For the bounded regression model with random uniform design, and for the nonparametric density model, we show that the resulting estimator is optimal in the minimax sense over all Besov balls under the $L^2$ risk, without any logarithmic factor.
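
    To make the construction concrete, the sketch below builds several term-by-term soft-thresholded wavelet estimators over a grid of thresholds and combines them with a convex combination whose weights are computed from the data. The PyWavelets package, the Daubechies-4 wavelet, the threshold grid and the residual-based weighting rule are illustrative assumptions, not the paper's exact procedure.

```python
# Hedged sketch: convex combination, with data-driven weights, of
# term-by-term thresholded wavelet estimators.  The wavelet, the grid of
# thresholds and the weighting rule are illustrative assumptions; requires
# the PyWavelets package (pip install pywavelets).
import numpy as np
import pywt


def thresholded_estimate(y, thr, wavelet="db4"):
    """Soft-threshold every detail coefficient term by term."""
    coeffs = pywt.wavedec(y, wavelet)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(y)]


def aggregated_estimate(y, thresholds, temperature=5.0):
    """Convex combination of thresholded estimators with exponential,
    residual-based weights (a simple stand-in for the paper's weighting)."""
    estimates = np.array([thresholded_estimate(y, t) for t in thresholds])
    residuals = np.mean((estimates - y) ** 2, axis=1)
    weights = np.exp(-temperature * residuals / residuals.mean())
    weights /= weights.sum()
    return weights @ estimates


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = np.linspace(0, 1, 512)
    signal = np.sin(6 * np.pi * x) + (x > 0.5)      # smooth part plus a jump
    y = signal + 0.3 * rng.normal(size=x.size)
    fhat = aggregated_estimate(y, thresholds=[0.1, 0.3, 0.6, 1.0])
    print("MSE:", np.mean((fhat - signal) ** 2))
```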
