Time Series Analysis, Cointegration, and Applications
The two prize winners in Economics this year would describe themselves as "Econometricians," so I thought that I should start by explaining that term. One can begin with the ancient subject of Mathematics, which is largely concerned with discovering relationships between deterministic variables by rigorous argument. (A deterministic variable is one whose value is known with certainty.) By the middle of the last millennium, however, it had become clear that some quantities were not deterministic and had to be described using probabilities, so Mathematics grew a substantial sub-field known as "Statistics." Statistics later became concerned with the analysis of data, and a number of methods have been developed for data having what may be called "standard properties."
Keywords: time series; cointegration
Modeling Amazon Deforestation for Policy Purposes
Brazil long ago removed most of the perverse government incentives that stimulated massive deforestation in the Amazon in the 1970s and 1980s, but one highly controversial policy remains: road building. While data are now abundantly available thanks to constant satellite surveillance of the Amazon, the analytical methods typically used to assess the impact of roads on natural vegetation cover are methodologically weak and not very helpful for guiding public policy. This paper discusses the respective weaknesses of typical GIS analysis and typical municipality-level regression analysis, and shows what would be needed to construct an ideal model of deforestation processes. It also presents an alternative approach that is much less demanding in terms of modeling and estimation and more useful for policy makers as well.
Keywords: Deforestation, Amazon, Brazil, econometric modeling
Properties of Nonlinear Transformations of Fractionally Integrated Processes
This paper shows that the properties of nonlinear transformations of a fractionally integrated process depend strongly on whether the initial series is stationary or not. Transforming a stationary Gaussian I(d) process with d > 0 leads to a long-memory process with the same or a smaller long-memory parameter, depending on the Hermite rank of the transformation. Any nonlinear transformation of an antipersistent Gaussian I(d) process is I(0). For non-stationary I(d) processes, every integer power transformation is non-stationary and exhibits a deterministic trend in mean and in variance. In particular, the square of a non-stationary Gaussian I(d) process still has long memory with parameter d, whereas the square of a stationary Gaussian I(d) process shows less dependence than the initial process. Simulation results for other transformations are also discussed.
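The stationary Gaussian I(d) processes discussed in this abstract can be simulated from their MA(infinity) representation, truncated at a finite number of lags. The sketch below is a standard approximation, not the paper's own simulation design; the parameter values (d = 0.3, truncation at 500 lags) are illustrative assumptions.

```python
import random

def arfima_weights(d, trunc):
    """MA coefficients psi_k = Gamma(k+d) / (Gamma(d) * Gamma(k+1)),
    built via the recursion psi_k = psi_{k-1} * (k - 1 + d) / k."""
    psi = [1.0]
    for k in range(1, trunc + 1):
        psi.append(psi[-1] * (k - 1 + d) / k)
    return psi

def simulate_arfima(n, d=0.3, trunc=500, seed=0):
    """Stationary Gaussian I(d) sample path via a truncated MA filter
    applied to i.i.d. standard normal innovations."""
    rng = random.Random(seed)
    psi = arfima_weights(d, trunc)
    eps = [rng.gauss(0.0, 1.0) for _ in range(n + trunc)]
    return [sum(psi[k] * eps[t - k] for k in range(trunc + 1))
            for t in range(trunc, n + trunc)]

x = simulate_arfima(200, d=0.3)
# Squaring x gives a Hermite-rank-2 transformation; per the paper's result,
# its long-memory parameter drops (to 2d - 1/2 for 1/4 < d < 1/2), so the
# squared series is less persistent than x itself.
```

The weight recursion avoids evaluating Gamma functions directly and is exact; the only approximation is the truncation of the infinite moving average.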
Advances in Supply Chain Management: Potential to Improve Forecasting Accuracy
Forecasting is a necessity in almost any operation. However, forecasting tools remain primitive in view of the great strides made by research and the increasing abundance of data made possible by automatic identification technologies such as radio frequency identification (RFID). The relationships among the many parameters that may change and affect decisions are so numerous that credible methods for deriving meaningful associations are needed to extract value from the acquired data. This paper proposes some modifications to adapt an advanced forecasting technique (GARCH) with the aim of developing it into a decision support tool applicable to a wide variety of operations, including supply chain management. We have attempted to coalesce several different ideas into a "solutions" approach aimed at modeling volatility and, in the process, perhaps better managing risk. It is possible that industry, governments, corporations, businesses, security organizations, consulting firms, and academics with deep knowledge in one or more fields may spend the next few decades striving to synthesize models of effective modus operandi that combine these ideas with other emerging concepts, tools, technologies, and standards to collectively better understand, analyze, and respond to uncertainty. However, the inclination to reject deep-rooted ideas on the basis of inconclusive results from pilot projects is a detrimental trend, and it raises the question of whether one can aspire to build an elephant using a mouse as a model.
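The GARCH technique the abstract proposes to adapt models time-varying volatility through a simple variance recursion. The sketch below simulates a GARCH(1,1) process; the parameter values (omega, alpha, beta) are hypothetical illustrations, not taken from the paper, and satisfy alpha + beta < 1 so the process has a finite unconditional variance.

```python
import random

def simulate_garch(n, omega=0.1, alpha=0.1, beta=0.8, seed=42):
    """Return (returns, conditional variances) for a GARCH(1,1) process:

        sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
        r_t      = sigma_t * z_t,   z_t ~ N(0, 1)
    """
    rng = random.Random(seed)
    sigma2 = omega / (1.0 - alpha - beta)  # start at unconditional variance
    r = 0.0
    returns, variances = [], []
    for _ in range(n):
        sigma2 = omega + alpha * r * r + beta * sigma2
        r = rng.gauss(0.0, sigma2 ** 0.5)
        returns.append(r)
        variances.append(sigma2)
    return returns, variances

rets, vols = simulate_garch(1000)
```

Plotting `rets` would show the volatility clustering (calm stretches punctuated by turbulent ones) that makes GARCH attractive for risk management in supply chains as well as in finance.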
Economic and Statistical Measures of Forecast Accuracy
This paper argues in favour of a closer link between decision and forecast evaluation problems. Although the idea of using decision theory for forecast evaluation appears early in the dynamic stochastic programming literature, and has continued to be used in meteorological forecasting, it is hardly mentioned in standard academic textbooks on economic forecasting. Some of the main issues involved are illustrated in the context of a two-state, two-action decision problem as well as in a more general setting. Relationships between statistical and economic methods of forecast evaluation are discussed, and useful links are established between the Kuipers score, used as a measure of forecast accuracy in the meteorology literature, and the market timing tests used in finance. An empirical application to the problem of stock market predictability is also provided, and the conditions under which such predictability could be exploited in the presence of transaction costs are discussed.
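The Kuipers score mentioned in this abstract is computed from the 2x2 contingency table of a two-state forecasting problem: the hit rate minus the false-alarm rate. A minimal sketch, with hypothetical contingency counts for illustration:

```python
def kuipers_score(hits, misses, false_alarms, correct_negatives):
    """Kuipers score (also called the true skill statistic):
    hit rate minus false-alarm rate. Ranges from -1 to 1;
    0 indicates no skill beyond chance, 1 a perfect forecast."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Hypothetical example: out of 60 up-market days the forecaster called 45,
# and out of 140 down-market days it issued 20 false up-calls.
score = kuipers_score(hits=45, misses=15, false_alarms=20,
                      correct_negatives=120)
print(round(score, 4))  # 0.75 - 0.1429 = 0.6071
```

Because the score rewards hits and penalizes false alarms symmetrically across the two states, it connects naturally to the market timing tests in finance that the paper links it to.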
I Didn't Run a Single Regression
Growth regression economics is haunted by the fact that results are easily overturned by regressing alternative model specifications. Recent research therefore aims at obtaining robust regression results by systematically running multiple models and picking the surviving variables. This note shows that one very popular such approach, the robust regression analysis due to Sala-i-Martin (1997), very likely leads to inconsistent conclusions, but may be remedied by refining the estimation algorithm. To that end I do not need to run a single regression.