    Understanding Economic Change

    Composite Ordinal Forecasting in Horse Racing - An Optimization Approach

    Using horse racing data in Hong Kong as an example, this paper looks into the properties of an optimization model for making composite ordinal forecasts based on minimizing the absolute error of the joint distribution of the errors of twelve forecasters of race outcomes. The optimization model is found to be not only theoretically sound but also robust, handling situations in which data are sparse.
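The abstract does not give the model's details, but the general idea of weighting several ordinal forecasters to minimize absolute error can be sketched. The toy below is an illustration, not the paper's twelve-forecaster model: two forecasters, made-up finishing positions, and a simple grid search over a single convex weight.

```python
# Illustrative toy: combine two ordinal forecasters with a convex weight w
# chosen to minimize mean absolute error (MAE) against observed finishing
# positions. All data are invented for illustration.

def mae(pred, actual):
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(actual)

def best_weight(f1, f2, actual, steps=100):
    """Grid-search w in [0, 1] minimizing MAE of w*f1 + (1-w)*f2."""
    best_w, best_err = 0.0, float("inf")
    for i in range(steps + 1):
        w = i / steps
        composite = [w * a + (1 - w) * b for a, b in zip(f1, f2)]
        err = mae(composite, actual)
        if err < best_err:
            best_w, best_err = w, err
    return best_w, best_err

# Two forecasters' predicted finishing positions for five horses,
# and the actual results (all hypothetical).
f1 = [1, 3, 2, 5, 4]
f2 = [2, 1, 3, 4, 5]
actual = [1, 2, 3, 4, 5]
w, err = best_weight(f1, f2, actual)
```

The composite's MAE is never worse than the better singular forecaster, which is the basic appeal of combining; the paper's model works with the joint error distribution rather than this simple grid search.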

    Dynamically optimal treatment allocation using Reinforcement Learning

    Devising guidance on how to assign individuals to treatment is an important goal in empirical research. In practice, individuals often arrive sequentially, and the planner faces various constraints such as limited budget/capacity, borrowing constraints, or the need to place people in a queue. For instance, a governmental body may receive a budget outlay at the beginning of a year and need to decide how best to allocate resources within the year to individuals who arrive sequentially. In this and other examples involving inter-temporal trade-offs, previous work on devising optimal policy rules in a static context is either not applicable or sub-optimal. Here we show how one can use offline observational data to estimate an optimal policy rule that maximizes expected welfare in this dynamic context. We allow the class of policy rules to be restricted for legal, ethical or incentive-compatibility reasons. The problem is equivalent to one of optimal control under a constrained policy class, and we exploit recent developments in Reinforcement Learning (RL) to propose an algorithm to solve this. The algorithm is easily implementable, with speedups achieved through multiple RL agents learning in parallel processes. We also characterize the statistical regret from using our estimated policy rule by casting the evolution of the value function under each policy in a Partial Differential Equation (PDE) form and using the theory of viscosity solutions to PDEs. We find that the policy regret decays at an $n^{-1/2}$ rate in most examples; this is the same rate as in the static case. Comment: 67 pages.
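The paper's algorithm is not reproduced here, but the flavor of budget-constrained sequential treatment assignment can be conveyed with a toy tabular Q-learning sketch. Everything below is an assumption for illustration: the horizon, budget, discrete "need" scores, and the reward model (welfare equals the treated individual's need).

```python
import random

# Toy sketch of budget-constrained sequential treatment via tabular
# Q-learning. All quantities (horizon, budget, need scores, reward model)
# are illustrative assumptions, not the paper's setup.
random.seed(0)
T, B = 10, 8                        # periods and total treatment budget
ALPHA, GAMMA, EPS = 0.1, 1.0, 0.1   # learning rate, discount, exploration
Q = {}                              # Q[((t, budget, need), action)] -> value

def q(s, a):
    return Q.get((s, a), 0.0)

def step(b, need, action):
    """Treating (action 1) costs one budget unit and yields welfare = need."""
    treated = action == 1 and b > 0
    return (float(need) if treated else 0.0), (b - 1 if treated else b)

for episode in range(5000):
    b = B
    for t in range(T):
        need = random.randint(0, 3)      # arriving individual's need score
        s = (t, b, need)
        if random.random() < EPS:        # epsilon-greedy exploration
            a = random.choice((0, 1))
        else:
            a = max((0, 1), key=lambda x: q(s, x))
        r, b2 = step(b, need, a)
        # Target averages the next state's value over the (uniform) need draw.
        nxt = 0.0
        if t + 1 < T:
            nxt = sum(max(q((t + 1, b2, n), 0), q((t + 1, b2, n), 1))
                      for n in range(4)) / 4.0
        Q[(s, a)] = q(s, a) + ALPHA * (r + GAMMA * nxt - q(s, a))
        b = b2

# After training, the greedy rule should treat a high-need arrival.
policy_high = max((0, 1), key=lambda x: q((0, B, 3), x))
```

The learned rule trades off immediate welfare against the option value of remaining budget, which is the inter-temporal tension the abstract describes; the paper's method additionally handles restricted policy classes and parallel agents.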

    ABSTRACTS: CONTRIBUTED PAPERS

    Teaching/Communication/Extension/Profession

    The Life-Cycle Income Analysis Model (LIAM): a study of a flexible dynamic microsimulation modelling computing framework

    This paper describes a flexible computing framework designed to create a dynamic microsimulation model, the Life-cycle Income Analysis Model (LIAM). The principal computing characteristics include the degree of modularisation, parameterisation, generalisation and robustness. The paper describes the decisions taken with regard to the type of dynamic model used. The LIAM framework has been used to create a number of different microsimulation models, including an Irish dynamic cohort model, a spatial dynamic microsimulation model for Ireland, an indirect tax and consumption model for the EU15 as part of EUROMOD, and a prototype EU dynamic population microsimulation model for 5 EU countries. Particular consideration is given to issues of parameterisation, alignment and computational efficiency.
    Keywords: flexible; modular; dynamic; alignment; parameterisation; computational efficiency
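The abstract does not specify LIAM's alignment algorithm, but "alignment by sorting" is a standard device in dynamic microsimulation and conveys the idea: model-predicted probabilities decide who transitions, while an external target decides how many. The probabilities and target below are made up.

```python
# Hedged sketch of "alignment by sorting", a common dynamic-microsimulation
# technique (not necessarily LIAM's exact method): the model decides WHO is
# most likely to transition, an external benchmark decides HOW MANY
# transitions occur. All numbers are illustrative.

def align(probabilities, target_count):
    """Flag the target_count individuals with the highest predicted
    transition probability."""
    order = sorted(range(len(probabilities)),
                   key=lambda i: probabilities[i], reverse=True)
    chosen = set(order[:target_count])
    return [i in chosen for i in range(len(probabilities))]

probs = [0.9, 0.1, 0.4, 0.7, 0.2]  # model-predicted transition probabilities
flags = align(probs, 2)            # external benchmark: exactly 2 transitions
# flags -> [True, False, False, True, False]
```

Alignment like this keeps simulated aggregates consistent with external projections without discarding the behavioural model's ranking of individuals.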

    Stochastic Tests on Live Cattle Steer Basis Composite Forecasts

    Since the seminal paper of Bates and Granger in 1969, a substantial body of work has been published on combining singular forecasts. Empirical evidence has consistently shown that combining forecasts produces a better model. While it is possible for the best singular model to outperform a composite model, using multiple models provides the advantage of risk diversification and has also been shown to produce lower forecasting error. The question of whether to combine has been replaced by the question of how much emphasis to place on each forecast. Researchers have sought to derive optimal weights that produce the lowest forecasting errors. Suggested schemes include equal weights, weights based on the mean square error, weights based on the covariance, and weighting the best previous model, among others. Other researchers have suggested mechanically derived weights obtained through computer programs; these weights have shown robust results. Once the composite and singular forecasts have been estimated, a systematic approach to evaluating them is needed. Forecasting error measures, such as root mean square error and mean absolute percentage error, are the most common elimination criteria in agriculture and other sectors. Although a valid means of selection, different forecasting error measures can produce different ordinal rankings of the forecasts, yielding inconclusive results. These findings have prompted the search for other suitable candidates for forecast evaluation, foremost among them stochastic dominance and stochastic efficiency. Stochastic dominance and stochastic efficiency have traditionally been used to rank wealth or returns across a group of alternatives, principally in the finance sector as a way to evaluate investment strategies. Holt and Brandt in 1985 proposed using stochastic dominance to select between different hedging strategies.
    Their results suggest that stochastic dominance can feasibly be used to select the most accurate forecast. This thesis had three objectives: 1) to determine whether live cattle basis forecasting error could be reduced relative to singular models by using composite forecasts; 2) to determine whether stochastic dominance and stochastic efficiency could be used to systematically select the most accurate forecasts; and 3) to determine whether currently reported forecasting error measures might lead to inaccurate conclusions about which forecast was best. The objectives were evaluated using two primary markets, Utah and Western Kansas, and two secondary markets, Texas and Nebraska. Live cattle slaughter steer basis data from 2004 to 2012 were computed from the Livestock Marketing Information Center, Chicago Mercantile Exchange, and United States Department of Agriculture. Seven singular models, adapted from the current academic literature, were initially used. After the models were evaluated using forecasting error, stochastic dominance and stochastic efficiency, seven composite models were created, each with a different weighting scheme. The "optimal" composite weight, in particular, was estimated using GAMS, with an objective function that selects the forecast combination minimizing the variance-covariance among the singular forecasting models. The composite models were likewise systematically evaluated using forecasting error, stochastic dominance and stochastic efficiency. The results indicate that forecasting error can be reduced, on average, in all four markets by using an optimal weighting scheme. Optimal weighting schemes can also outperform the benchmark equal weights, and a combination of fast-reacting time series forecasts and market-condition (supply and demand) forecasts provides the better model.
    Stochastic dominance and stochastic efficiency provided confirmatory results and selected the efficient set of forecasts over a range of risk preferences. They also indicated that forecasting error provides a point estimate rather than a range of error. Suggestions are offered for applying these methods in extension outlook forecasts and industry settings.
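The thesis's GAMS optimization is not reproduced here, but the classic minimum-variance composite of two forecasts (Bates and Granger, 1969) conveys the core idea: weights are chosen from the forecast errors' variance-covariance structure. The error series below are invented for illustration.

```python
# Illustrative sketch (not the thesis's GAMS program): the classic
# minimum-variance composite of two forecasts from Bates and Granger (1969).
# The weight on model 1 is
#   w = (s22 - s12) / (s11 + s22 - 2*s12),
# where s11 and s22 are the forecast-error variances and s12 their
# covariance. All error values below are hypothetical.

def min_variance_weight(e1, e2):
    n = len(e1)
    m1, m2 = sum(e1) / n, sum(e2) / n
    s11 = sum((x - m1) ** 2 for x in e1) / n
    s22 = sum((y - m2) ** 2 for y in e2) / n
    s12 = sum((x - m1) * (y - m2) for x, y in zip(e1, e2)) / n
    return (s22 - s12) / (s11 + s22 - 2 * s12)

# Hypothetical basis forecast errors ($/cwt) from two singular models.
e1 = [0.5, -0.3, 0.8, -0.6, 0.2]
e2 = [-0.4, 0.7, -0.2, 0.9, -0.5]
w = min_variance_weight(e1, e2)
composite = [w * a + (1 - w) * b for a, b in zip(e1, e2)]
```

Because the two error series here are negatively correlated, the optimal weight falls strictly between 0 and 1 and the composite's error variance is below either singular model's, which is the diversification argument the abstract describes.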