
    Automated ANN alerts: one step ahead with mobile support

    In this paper, I examine the potential of mobile alerting services to empower investors to react quickly to critical market events. To this end, I analyze short-term (intraday) price effects and find abnormal returns to company announcements that are completed within a timeframe of minutes. To make use of these findings, these price effects are predicted using pre-defined external metrics and different estimation methodologies. Compared with previous research, the results support that artificial neural networks and multiple linear regression are good estimation models for forecasting price effects on an intraday basis as well. As most of the price-effect magnitude and effect delay can be estimated correctly, it is demonstrated how a suitable mobile alerting service combining a low level of user-intrusiveness with timely information supply can be designed.
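    The multiple linear regression baseline in this setting is straightforward to sketch. The snippet below is a hypothetical illustration, not the paper's model: the metric names, coefficients, and synthetic data are all assumptions. It fits an OLS model mapping pre-defined announcement metrics to a price-effect magnitude and predicts the effect for a new announcement.

```python
import numpy as np

# Hypothetical illustration (names and data are assumptions): forecast the
# magnitude of an intraday price effect from pre-defined announcement
# metrics via multiple linear regression.
rng = np.random.default_rng(0)

n = 200
X = rng.normal(size=(n, 3))               # e.g. surprise, firm size, volume
true_beta = np.array([0.8, -0.3, 0.5])    # assumed "true" sensitivities
effect = X @ true_beta + rng.normal(scale=0.1, size=n)  # abnormal return

# Fit ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, effect, rcond=None)

# Predict the price effect for a new announcement's metrics.
x_new = np.array([1.0, 0.5, -1.0, 0.2])   # [intercept, three metrics]
predicted_effect = x_new @ beta_hat
```

    An ANN would replace the linear map with a non-linear one, but the fit/predict workflow is the same.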

    One-step-ahead kinematic compressive sensing

    A large portion of work on compressive sampling and sensing has focused on reconstructions from a given measurement set. When the individual samples are expensive and optional, as is the case with autonomous agents operating in a physical domain and under specific energy limits, the CS problem takes on a new aspect because the projection is column-sparse and the number of samples is not necessarily large. As a result, random sampling may no longer be the best tactic. The incoherence properties underlying l0 reconstruction, however, can still motivate the purposeful design of samples in planning for CS with one or more agents; we develop here a greedy and computationally tractable sampling rule that improves errors relative to random points. Several example cases illustrate that the approach is effective and robust. (United States Office of Naval Research, Grant N00014-09-1-0700.)
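    The greedy idea can be sketched as follows, under assumptions of mine rather than the paper's exact rule: pick sample rows of a sparsifying basis (here a DCT basis) one at a time so that the mutual coherence of the row-subsampled matrix stays small, as an alternative to choosing the rows at random.

```python
import numpy as np

# Sketch of greedy, coherence-driven sample selection for CS
# (a stand-in for the paper's rule, not a reproduction of it).

def dct_basis(n):
    # Orthonormal DCT-II basis, built explicitly to stay self-contained.
    k = np.arange(n)
    B = np.cos(np.pi * (2 * k[:, None] + 1) * k[None, :] / (2 * n))
    B[:, 0] *= 1 / np.sqrt(2)
    return B * np.sqrt(2 / n)

def coherence(A):
    # Max absolute inner product between distinct normalized columns.
    G = A / np.linalg.norm(A, axis=0, keepdims=True)
    M = np.abs(G.T @ G)
    np.fill_diagonal(M, 0.0)
    return M.max()

def greedy_samples(psi, m):
    # Start from an arbitrary first row, then add the row that keeps
    # the subsampled matrix's mutual coherence smallest.
    n = psi.shape[0]
    chosen = [0]
    while len(chosen) < m:
        candidates = [i for i in range(n) if i not in chosen]
        best = min(candidates, key=lambda i: coherence(psi[chosen + [i], :]))
        chosen.append(best)
    return chosen

n, m = 32, 8
psi = dct_basis(n)
idx = greedy_samples(psi, m)
```

    The rule is myopic but tractable: each step costs one coherence evaluation per candidate row, which suits agents that must commit to samples sequentially.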

    A note on prediction and interpolation errors in time series

    In this note we analyze the relationship between one-step-ahead prediction errors and interpolation errors in time series. We obtain an expression for the prediction errors in terms of the interpolation errors and then show that minimizing the sum of squares of the standardized one-step-ahead prediction errors is equivalent to minimizing the sum of squares of the standardized interpolation errors.
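    For the AR(1) special case the prediction/interpolation link can be checked numerically (the model and parameter value here are my assumptions, not the note's general setting): with prediction error e_t = y_t - phi*y_{t-1} and best linear interpolator phi*(y_{t-1} + y_{t+1})/(1 + phi^2), the interpolation error satisfies i_t = (e_t - phi*e_{t+1})/(1 + phi^2).

```python
import numpy as np

# Numerical check of the AR(1) identity linking one-step-ahead prediction
# errors and interpolation errors (illustrative special case).
rng = np.random.default_rng(1)
phi, n = 0.6, 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

# One-step-ahead prediction errors: e_t = y_t - phi * y_{t-1}
e = y[1:] - phi * y[:-1]

# Interpolation errors: y_t minus its best linear interpolator
# E[y_t | y_{t-1}, y_{t+1}] = phi * (y_{t-1} + y_{t+1}) / (1 + phi**2)
i_err = y[1:-1] - phi * (y[:-2] + y[2:]) / (1 + phi**2)

# Identity: i_t = (e_t - phi * e_{t+1}) / (1 + phi**2)
check = (e[:-1] - phi * e[1:]) / (1 + phi**2)
```

    The identity holds exactly, path by path, which is the mechanism behind the equivalence of the two sums of squares after standardization.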

    Gaussian Process priors with uncertain inputs - Application to multiple-step ahead time series forecasting

    We consider the problem of multi-step ahead prediction in time series analysis using the non-parametric Gaussian process model. k-step ahead forecasting of a discrete-time non-linear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form y_t = f(y_{t-1}, ..., y_{t-L}), the prediction of y at time t + k is based on the point estimates of the previous outputs. In this paper, we show how, using an analytical Gaussian approximation, we can formally incorporate the uncertainty about intermediate regressor values, thus updating the uncertainty on the current prediction.
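    The naive point-estimate baseline the paper improves on can be sketched with a linear AR(2) model standing in for the Gaussian process (an assumption made purely for brevity): fit a one-step model, then iterate it, feeding each point prediction back in as a regressor and ignoring its uncertainty.

```python
import numpy as np

# Sketch of naive multi-step-ahead forecasting by iterating one-step-ahead
# point predictions; a linear AR(2) stands in for the GP (an assumption).
rng = np.random.default_rng(2)

# Simulate data from y_t = 1.4*y_{t-1} - 0.5*y_{t-2} + noise.
n, L = 1000, 2
y = np.zeros(n)
for t in range(2, n):
    y[t] = 1.4 * y[t - 1] - 0.5 * y[t - 2] + 0.1 * rng.normal()

# Fit the one-step model f by least squares on lagged regressors.
X = np.column_stack([y[1:-1], y[:-2]])   # (y_{t-1}, y_{t-2})
coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)

def k_step_forecast(history, k):
    """Iterate one-step predictions, feeding point estimates back in."""
    h = list(history[-L:])
    for _ in range(k):
        h.append(coef[0] * h[-1] + coef[1] * h[-2])
    return h[L:]

forecasts = k_step_forecast(y, 5)
```

    Because each predicted value re-enters as if it were observed, the predictive variance at horizon k is understated; propagating the regressor uncertainty analytically is exactly the paper's contribution.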

    On the Bass diffusion theory, empirical models and out-of-sample forecasting

    The Bass (1969) diffusion theory often guides the construction of forecasting models for new product diffusion. To match the model with data, one needs to put forward a statistical model. This paper compares four empirical versions of the model, two of which explicitly incorporate autoregressive dynamics. Next, it is shown that some of the regression models imply multi-step ahead forecasts that are biased. Therefore, one is better off relying on the simulation methods put forward in this paper. An empirical analysis of twelve series (Van den Bulte and Lilien 1997) indicates that one-step ahead forecasts improve substantially when autoregressive terms are included and that simulated two-step ahead forecasts are quite accurate.

    Keywords: forecasting; diffusion
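    A simulation-based multi-step forecast in the spirit of the paper can be sketched with the discrete-time Bass hazard (the parameter values p, q, m and the noise model are illustrative assumptions, not estimates from the twelve series): new adoptions in each period are (p + q*F_t)(m - F_t), and forecasts are averages over simulated continuations rather than plug-in regression extrapolations.

```python
import numpy as np

# Illustrative Bass-model simulation (p, q, m and the noise scale are
# assumptions): p = innovation, q = imitation, m = market size (normalized).
rng = np.random.default_rng(3)
p, q, m = 0.03, 0.38, 1.0

def bass_path(T, sigma=0.0, rng=None):
    """Cumulative adoption F_t under the discrete-time Bass hazard,
    optionally with noise added to each period's new adoptions."""
    F = np.zeros(T)
    for t in range(1, T):
        new = (p + q * F[t - 1]) * (m - F[t - 1])
        if sigma and rng is not None:
            new += sigma * rng.normal()
        # keep the path monotone and inside [0, m]
        F[t] = min(m, max(F[t - 1], F[t - 1] + new))
    return F

# Deterministic path: adoption saturates near the market size m.
F = bass_path(60)

# Simulation-based forecast: average many noisy paths at a given horizon.
sims = np.array([bass_path(60, sigma=0.005, rng=rng) for _ in range(200)])
mc_forecast = sims[:, 30].mean()
```

    Averaging over simulated paths sidesteps the bias that the paper documents for multi-step forecasts derived directly from the regression equations.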

    Keeping One Step Ahead


    One-step-ahead implementation

    In many situations, agents are involved in an allocation problem that is followed by another allocation problem whose optimal solution depends on how the former problem has been solved. In this paper, we take this dynamic structure of allocation problems as an institutional constraint. By assuming a finite number of allocation problems, one for each period/stage, and by assuming that all agents in society are involved in each allocation problem, a dynamic mechanism is a period-by-period process. This process generates at any period-t history a period-t mechanism with observable actions and simultaneous moves. We also assume that the objectives that a planner wants to achieve are summarized in a social choice function (SCF), which maps each state (of the world) into a period-by-period outcome process. In each period t, this process selects for each state a period-t socially optimal outcome conditional on the complete outcome history realized up to period t. Heuristically, the SCF is one-step-ahead implementable if there exists a dynamic mechanism such that for each state and each realized period-t history, each of its subgame perfect Nash equilibria generates a period-by-period outcome process that coincides with the period-by-period outcome process that the SCF generates at that state from period t onwards. We identify a necessary condition for SCFs to be one-step-ahead implementable, one-step-ahead Maskin monotonicity, and show that it is also sufficient under a variant of the no-veto-power condition when there are three or more agents. Finally, we provide an account of the welfare implications of one-step-ahead implementability in the contexts of trading decisions and voting problems.
