1,421 research outputs found

    Scenario-based forecast for the electricity demand in Qatar and the role of energy efficiency improvements

    We model electricity consumption in the market segments that compose the Qatari electricity market, linking electricity consumption to GDP growth and population growth. Building on the estimated model, we develop long-range forecasts of electricity consumption from 2017 to 2030 under different scenarios for the economic drivers. In addition, we proxy for electricity efficiency improvements by reducing the long-run elasticity of electricity consumption to GDP and population. We show that electricity efficiency plays a crucial role in controlling the future development of electricity consumption. Energy policies should take this into account and support both electricity efficiency improvement programs and a price reform.
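
    As a rough illustration of this kind of elasticity-based scenario analysis (not the authors' actual model), the sketch below projects demand under assumed GDP and population growth paths and proxies efficiency gains by scaling down the elasticities; the elasticities, growth rates and base-year level are all invented for the example.

```python
import numpy as np

# Illustrative log-linear demand model: log(E_t) = log(E_{t-1})
# + beta_gdp * g_gdp + beta_pop * g_pop, where the betas play the
# role of long-run elasticities. All numbers below are made up.
beta_gdp, beta_pop = 0.6, 0.9          # assumed elasticities
e0 = 40.0                              # assumed base-year demand (TWh)
years = np.arange(2017, 2031)

scenarios = {                          # assumed annual growth rates
    "low":  {"gdp": 0.015, "pop": 0.010},
    "high": {"gdp": 0.035, "pop": 0.020},
}

def forecast(gdp_growth, pop_growth, efficiency_cut=0.0):
    """Project demand; efficiency_cut proxies efficiency gains by
    scaling down the elasticities, as described in the abstract."""
    bg = beta_gdp * (1 - efficiency_cut)
    bp = beta_pop * (1 - efficiency_cut)
    growth = bg * gdp_growth + bp * pop_growth
    return e0 * np.exp(growth * np.arange(1, len(years) + 1))

for name, s in scenarios.items():
    base = forecast(s["gdp"], s["pop"])
    eff  = forecast(s["gdp"], s["pop"], efficiency_cut=0.2)
    print(name, round(base[-1], 1), "TWh vs", round(eff[-1], 1), "TWh in 2030")
```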

    Growth Rates Preservation (GRP) temporal benchmarking: Drawbacks and alternative solutions

    Benchmarking monthly or quarterly series to annual data is a common practice in many National Statistical Institutes. The benchmarking problem arises when time series data for the same target variable are measured at different frequencies and the discrepancies between the sums of the sub-annual values and their annual benchmarks need to be removed. Several benchmarking methods are available in the literature. The Growth Rates Preservation (GRP) benchmarking procedure is often considered the best method, since it is claimed to be grounded on an ideal movement preservation principle. However, we show that GRP has important drawbacks, relevant for practical applications, that have gone unnoticed in the literature. We then consider alternative benchmarking models that do not suffer from some of GRP's side effects.
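
    For context, a minimal sketch of the GRP criterion as commonly stated: minimize the sum of squared differences between the growth rates of the benchmarked and preliminary series, subject to the annual sums matching the benchmarks. It is solved here with a generic constrained optimizer on invented data, and it illustrates only the problem, not the drawbacks or alternatives discussed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: 8 quarters of a preliminary series p and
# 2 annual benchmarks b (the quarterly sums must hit these).
p = np.array([98., 102., 105., 101., 104., 108., 112., 109.])
b = np.array([420., 450.])

def grp(x):
    """Sum of squared growth-rate differences (the GRP criterion)."""
    return np.sum((x[1:] / x[:-1] - p[1:] / p[:-1]) ** 2)

# One equality constraint per year: the four quarters sum to the benchmark.
cons = [{"type": "eq", "fun": lambda x, k=k: x[4*k:4*k+4].sum() - b[k]}
        for k in range(len(b))]

res = minimize(grp, x0=p, constraints=cons, method="SLSQP")
print(np.round(res.x, 2), grp(res.x))
```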

    Our products are safe (don't tell anyone!). Why don't supermarkets advertise their private food safety standards?

    Large retail chains have spent considerable resources to promote production protocols and traceability across the supply chain with the aim of increasing food safety. Yet most consumers are unaware of these private food safety standards (PFSS), and retailers are not informing them. This behavior denotes a pooling paradox: supermarkets spend large amounts of money on food safety and yet fail to inform consumers. The result is a pooling equilibrium in which consumers cannot discriminate between high-quality and low-quality products, and supermarkets give up the potential price premium. This paper provides an economic explanation for the paradox using a contract-theory model. We find that PFSS implementation may be rational even if consumers have no willingness to pay for safety, because the standard can be used as a tool to resolve asymmetric information along the supply chain. Using the PFSS, supermarkets can achieve a separating equilibrium in which opportunistic suppliers have no incentive to accept the contract. Even if consumers exhibit a limited (but strictly positive) willingness to pay for safety, advertising may be profit-reducing. If the expected price margin is high enough, supermarkets have an incentive to supply both certified and uncertified products. In this case, we show that, if consumers perceive undifferentiated products as “reasonably safe”, supermarkets may maximize profits by pooling the goods and selling them as undifferentiated. This result is not driven by advertising costs, as we derive it assuming free advertising.

    New Procedures for the Reconciliation of Time Series

    We propose new simultaneous and two-step procedures for reconciling systems of time series subject to temporal and contemporaneous constraints according to a Growth Rates Preservation (GRP) principle. Two nonlinear optimization algorithms are used: an interior-point method applied to the constrained problem, and a Newton's method with Hessian modification applied to a suitably reduced unconstrained problem. Both techniques exploit the analytic gradient and Hessian of the GRP objective function, making full use of all the derivative information available. We apply the proposed GRP procedures to two large systems of economic series, and compare the results with those of other reconciliation procedures based on the Proportional First Differences (PFD) principle, a linear approximation of the GRP principle widely used by data-producing agencies. Our experiments show that (i) an optimal solution to the nonlinear GRP problem can be efficiently achieved through the proposed Newton's optimization algorithms, and (ii) GRP-based procedures preserve the growth rates in the system better than linear PFD solutions, especially for series with high temporal discrepancy and high volatility.
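
    The GRP objective is a smooth but non-convex function of the benchmarked values, and procedures of this kind rely on its analytic derivatives. As a small illustration (with invented data, and only the gradient rather than the full Hessian), the sketch below codes the GRP objective together with its analytic gradient and verifies the gradient against finite differences.

```python
import numpy as np

# GRP objective f(x) = sum_t (x_t/x_{t-1} - p_t/p_{t-1})^2 and its
# analytic gradient, as exploited by gradient/Newton-type solvers.
# The preliminary series p is illustrative.
p = np.array([100., 103., 101., 106., 110., 108.])
r = p[1:] / p[:-1]                          # preliminary growth rates

def f(x):
    return np.sum((x[1:] / x[:-1] - r) ** 2)

def grad(x):
    d = x[1:] / x[:-1] - r                  # growth-rate discrepancies
    g = np.zeros_like(x)
    g[1:] += 2 * d / x[:-1]                 # d f / d x_t      (numerator)
    g[:-1] -= 2 * d * x[1:] / x[:-1] ** 2   # d f / d x_{t-1}  (denominator)
    return g

# Quick check of the analytic gradient against central finite differences.
x = p + np.arange(len(p))
eps = 1e-6
num = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                for e in np.eye(len(x))])
print(np.allclose(grad(x), num, atol=1e-6))
```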

    The Enhanced Denton's Benchmarking Procedure for Extrapolation. Matrix Formulation and Practical Issues.

    Statistical agencies often make recourse to benchmarking to estimate Quarterly National Accounts. One of the most widely used benchmarking procedures is the one based on the Proportional First Differences (PFD) criterion, following the original proposal by Denton (1971). Despite its good properties for distribution, this method may give unsatisfactory results in the extrapolation of the quarters of the current year. An enhanced version of the Denton PFD method has been suggested by the International Monetary Fund (IMF) to improve forecasting accuracy. In this paper we provide a matrix formalization of the enhanced solution and analyze its properties through artificial data. Finally, we critically review the shortcut version of the enhanced method proposed by the IMF, which is currently in use in some statistical agencies.
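
    For reference, a minimal sketch of the standard (modified) Denton PFD benchmarking in matrix form, solved through its first-order conditions on invented quarterly data; the enhanced IMF variant and its shortcut discussed in the paper are not reproduced here.

```python
import numpy as np

# Modified Denton PFD benchmarking in matrix form (illustrative data):
# minimize || D diag(p)^{-1} x ||^2  subject to  C x = b,
# where D is the first-difference matrix and C sums quarters to years.
p = np.array([98., 102., 105., 101., 104., 108., 112., 109.])  # preliminary
b = np.array([420., 450.])                                      # benchmarks
n, m = len(p), len(b)

D = np.eye(n)[1:] - np.eye(n)[:-1]          # (n-1) x n first differences
C = np.kron(np.eye(m), np.ones((1, 4)))     # m x n temporal aggregation
Pinv = np.diag(1.0 / p)

# First-order conditions of the equality-constrained least squares:
# [2M  C'] [x]   [0]
# [C   0 ] [l] = [b]     with  M = Pinv D' D Pinv
M = Pinv @ D.T @ D @ Pinv
A = np.block([[2 * M, C.T], [C, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), b])
x = np.linalg.solve(A, rhs)[:n]

print(np.round(x, 2), C @ x)                # benchmarked series and its sums
```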

    Statistical Reconciliation of Time Series. Movement Preservation vs. a Data Based Procedure.

    Most of the data obtained by statistical agencies have to be adjusted, corrected or otherwise processed by statisticians in order to arrive at useful, consistent and publishable values. When temporally and contemporaneously aggregated series are known, temporal (e.g., between quarterly and annual data) and contemporaneous (between the quarterly aggregate and the sum of its component series) discrepancies can be eliminated using various reconciliation procedures. In this paper we consider (i) an extension of the univariate benchmarking approach by Denton (1971), founded on a well-known movement preservation principle, and (ii) a data-based benchmarking procedure (Guerrero and Nieto, 1999), which exploits the autoregressive features of the preliminary series to be adjusted. In order to evaluate their performance in practical situations, both procedures are applied to simulated and real-world data.

    Cross-temporal forecast reconciliation: Optimal combination method and heuristic alternatives

    Forecast reconciliation is a post-forecasting process aimed at improving the quality of the base forecasts for a system of hierarchical/grouped time series (Hyndman et al., 2011). Contemporaneous (cross-sectional) and temporal hierarchies have been considered in the literature but, with the exception of Kourentzes and Athanasopoulos (2019), these two features have generally not been fully considered together. Adopting a notation able to deal simultaneously with both forecast reconciliation dimensions, the paper presents two new results: (i) an iterative cross-temporal forecast reconciliation procedure which extends, and overcomes some weaknesses of, the two-step procedure by Kourentzes and Athanasopoulos (2019), and (ii) the closed-form expression of the optimal (in the least squares sense) point forecasts which fulfill both contemporaneous and temporal constraints. The feasibility of the proposed procedures, along with first evaluations of their performance compared to the best-performing 'single dimension' (either cross-sectional or temporal) forecast reconciliation procedures, is studied through a forecasting experiment on the 95 quarterly time series of the Australian GDP from the Income and Expenditure sides considered by Athanasopoulos et al. (2019).
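
    To give an idea of what a least-squares point-forecast reconciliation looks like, the sketch below applies the generic zero-constrained projection formula to a tiny cross-sectional example with invented numbers; the paper's closed-form result extends this kind of formula to the full set of cross-temporal constraints, which is not reproduced here.

```python
import numpy as np

# Generic least-squares point-forecast reconciliation in its
# zero-constrained form: given incoherent base forecasts yhat and
# linear constraints A y = 0, the reconciled forecasts are the
# projection  y_tilde = yhat - W A' (A W A')^{-1} A yhat.
# Tiny cross-sectional example (total = series1 + series2), W = I;
# all numbers are invented.
A = np.array([[1.0, -1.0, -1.0]])         # y_total - y_1 - y_2 = 0
yhat = np.array([105.0, 60.0, 40.0])      # base forecasts (incoherent)
W = np.eye(3)

y_tilde = yhat - W @ A.T @ np.linalg.solve(A @ W @ A.T, A @ yhat)
print(np.round(y_tilde, 2), A @ y_tilde)  # coherent forecasts, residual ~ 0
```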

    A Newton's Method for Benchmarking Time Series according to a Growth Rates Preservation Principle

    We present a new technique for temporally benchmarking a time series according to the Growth Rates Preservation (GRP) principle of Causey and Trager (1981). This procedure looks for the solution to a nonlinear program in which f(x), a smooth, non-convex function of the unknown values of the target time series x_t, t = 1, ..., n, has to be minimized subject to linear equality constraints linking the more frequent series x_t to a given, less frequent benchmark series b_T, T = 1, ..., m. We develop a Newton's method with Hessian modification applied to a suitably reduced unconstrained problem. This method exploits the analytic Hessian of the GRP objective function, making full use of all the derivative information available. We show that the proposed technique is easy to implement, computationally robust and efficient, all features which make it a plausible competitor of other benchmarking procedures (Denton, 1971; Dagum and Cholette, 2006), also in a data-production process involving a considerable number of series.
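
    Because the GRP objective is non-convex, its Hessian can be indefinite, and modified-Newton schemes safeguard the search direction before solving the Newton system. The sketch below shows one common eigenvalue-based modification on an invented 2x2 Hessian; it is not necessarily the specific modification used in the paper.

```python
import numpy as np

# Hessian modification, a safeguard used in modified-Newton schemes
# for non-convex objectives such as the GRP criterion: if the Hessian
# has eigenvalues <= 0, floor them so that the Newton direction
# d = -H_mod^{-1} g is guaranteed to be a descent direction.
def modified_newton_direction(H, g, floor=1e-2):
    w, V = np.linalg.eigh(H)               # H symmetric: eigendecompose
    w = np.maximum(w, floor)               # floor the eigenvalues
    H_mod = V @ np.diag(w) @ V.T
    return -np.linalg.solve(H_mod, g)

# Toy indefinite Hessian and gradient (illustrative numbers only).
H = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([1.0, 1.0])
d = modified_newton_direction(H, g)
print(d, g @ d < 0)                        # descent direction: g'd < 0
```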

    Simultaneous and Two-step Reconciliation of Systems of Time Series.

    The reconciliation of systems of time series subject to both temporal and contemporaneous constraints can be carried out in such a way that the temporal profiles of the original series are preserved as far as possible (movement preservation principle). Thanks to the sparsity of the linear system to be solved, a feasible procedure can be developed to solve the problem simultaneously. A two-step strategy may be more suitable in the case of large systems: first, each series is aligned to the corresponding temporal constraints according to a movement preservation principle; second, all series are reconciled within each low-frequency period according to the given constraints. This work compares the results of the simultaneous and two-step approaches on medium-to-large real-life datasets and discusses the conditions under which the two-step procedure can be a valid alternative to the simultaneous one.
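
    A highly simplified sketch of the two-step idea, on invented data: step 1 benchmarks each series to its annual total (plain pro-rating stands in for a movement-preserving method), and step 2 balances the contemporaneous identity quarter by quarter with a least-squares spread of the residual. The actual procedures compared in the paper are richer than this.

```python
import numpy as np

# Two quarterly component series plus their quarterly aggregate,
# with one annual benchmark per series (all numbers invented).
q = {"agg": np.array([210., 215., 220., 225.]),
     "s1":  np.array([120., 122., 125., 127.]),
     "s2":  np.array([ 88.,  90.,  93.,  95.])}
annual = {"agg": 880., "s1": 500., "s2": 380.}

# Step 1: align each series to its annual total (pro-rating here).
step1 = {k: v * annual[k] / v.sum() for k, v in q.items()}

# Step 2: per quarter, spread the residual agg - (s1 + s2) equally
# over the three series (least-squares adjustment of the identity).
# Since the annual benchmarks are mutually consistent (880 = 500 + 380),
# the residuals sum to zero over the year, so annual totals are kept.
res = step1["agg"] - step1["s1"] - step1["s2"]
step2 = {"agg": step1["agg"] - res / 3,
         "s1":  step1["s1"] + res / 3,
         "s2":  step1["s2"] + res / 3}
print({k: np.round(v, 2) for k, v in step2.items()})
```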

    Special Issue - Preliminary analyses and tests on the accessibility of mobile devices

    Interaction with mobile devices is no longer uniform: it can vary according to the user profile, the device, or the context in which we operate. This aspect is particularly important for people with limited abilities, who must interact with their mobile devices using alternative modalities. These users may have visual, hearing, physical or age-related limitations that prevent access when it is provided only in default modes such as the graphical one. The aim of this work is to give an overview of the accessibility of current mobile devices, in order to offer users some guidance on their use, to analyse which solutions have been created and applied to make these new devices accessible, and to at least outline the barriers that still exist. Among the possible criteria for classifying and describing the situation, we chose the distinctions based on the operating system. Other criteria could have been adopted, such as the type of device; however, since different devices, such as tablets and smartphones, behave similarly when running the same operating system, we preferred to start from the base software. The physical size of the device changes, but not the criteria used to ensure accessibility.
