Australia's Cash Economy: Are the estimates credible?
The method of "excess sensitivity" of Bajada (1999, 2001, 2002) indicates a large underground economy in Australia, with estimates of unrecorded income around 15 per cent of official GDP. These estimates concern policymakers, especially those agencies responsible for national accounts, tax collection, economic stabilization and law enforcement. We show that the method exhibits a severe form of non-robustness, in which the results change markedly with a simple change in the units of measurement of the variables. There is a separate problem in which a key parameter is set to an unrealistic value that makes the estimates many times too high.
Keywords: underground economy, currency demand, tax evasion, econometric models
Estimating the Underground Economy using MIMIC Models
MIMIC models are being used to estimate the size of the underground economy or the tax gap in various countries. In this paper I examine critically both the method in general and three applications of the method by Giles and Tedds (2002), Bajada and Schneider (2005) and Dell’Anno and Schneider (2003). Connections are shown to familiar econometric models of linear regression and simultaneous equations. I also investigate the auxiliary procedures used in this literature, including differencing as a treatment for unit roots and the calibration of results using other data. The three applications demonstrate how the method is subjective and pliable in practice. I conclude that the MIMIC method is unfit for the purpose.
Keywords: underground economy, MIMIC, structural modelling, LISREL® software
Global Temperature Trends
Are global temperatures on a warming trend? It is difficult to be certain about trends when there is so much variation in the data and very high correlation from year to year. We investigate the question using statistical time series methods. Our analysis shows that the upward movement over the last 130-160 years is persistent and not explained by the high correlation, so it is best described as a trend. The warming trend becomes steeper after the mid-1970s, but there is no significant evidence for a break in trend in the late 1990s. Viewed from the perspective of 30 or 50 years ago, the temperatures recorded in most of the last decade lie above the confidence band of forecasts produced by a model that does not allow for a warming trend.
Keywords: land and ocean temperatures; deterministic and stochastic trends; persistence; piecewise linear trends
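A trend whose slope changes at a known date, as described in the abstract, can be fitted as a continuous piecewise linear regression. The sketch below uses synthetic data only; the break year, slopes and noise process are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1850, 2011)
t = (years - years[0]).astype(float)
break_year = 1975                          # assumed break point, for illustration only
hinge = np.maximum(0.0, years - break_year)

# synthetic anomaly series: a gentle trend that steepens after the break, plus AR(1) noise
e = np.zeros(len(t))
for i in range(1, len(t)):
    e[i] = 0.6 * e[i - 1] + rng.normal(scale=0.05)
y = 0.002 * t + 0.015 * hinge + e

# continuous piecewise linear trend: the slope is allowed to change at the break year
X = np.column_stack([np.ones(len(t)), t, hinge])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
slope_before, slope_after = b[1], b[1] + b[2]
print(slope_before, slope_after)
```

The hinge regressor `max(0, year - break)` keeps the fitted trend continuous at the break, so the coefficient on it measures the change in slope directly.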
Australia’s underground economy – redux?
Bajada (2006) recognises that his earlier books and papers used a faulty method for measuring the underground economy in Australia. He also reports finding a new “more serious problem” in the method. All of these failures can be avoided, it is claimed, by reduced use of currency modelling and more reliance on outside estimates. Despite delivering estimates up to two-thirds less than before, the revised method involves substantial double counting. Ironically, these problems are found only in Bajada’s particular method, not in currency modelling generally.
Some aspects of statistical inference for econometrics
This thesis is concerned with examining relationships among the various asymptotic hypothesis testing principles in econometric settings and with developing applications of the Lagrange multiplier (LM) procedure to econometric problems. For a wide range of hypothesis testing situations, particularly those associated with detecting misspecification errors in regression models, it is argued that the LM method is most useful. The LM test, which is asymptotically equivalent to the likelihood ratio test in regular problems, is frequently less demanding computationally than other procedures that might be applied in the same circumstances. In addition, the LM statistic sometimes corresponds to a criterion which is familiar to the econometrician but which has previously been motivated by other considerations. The LM testing principle provides a convenient framework in which such existing tests can be extended and new tests can be developed.
Chapter 1 sketches the theoretical setting that is applicable to many statistical problems in econometrics and highlights a number of aspects of the various testing principles, for reference in later chapters. Tests of coefficient restrictions in linear regression models are considered in Chapter 2, including an examination of a systematic numerical inequality relationship among the criteria. Chapter 3 is concerned with the LM test in its various guises and with the applicability of the LM method to diverse econometric situations. Specific applications are considered in greater detail in Chapters 4 through 6: in Chapter 4 the LM method is applied to testing for autocorrelation in dynamic single-equation linear models; in Chapter 5 the ideas of the preceding chapter are extended to simultaneous equations systems; and in Chapter 6 a test against a wide class of heteroscedastic disturbance formulations is developed. Since the theoretical properties of the LM test derive mainly from asymptotic considerations, questions regarding the validity of asymptotic results in practical situations with finite sample sizes remain open. A Monte Carlo simulation study, comparing the LM test for heteroscedasticity with other asymptotically equivalent tests, is presented in Chapter 7.
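An LM test for heteroscedasticity of the kind developed in Chapter 6 is commonly computed, in its studentized form, as n·R² from an auxiliary regression of squared OLS residuals on the suspected variance drivers. A minimal sketch on synthetic data (the data-generating process and variable names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
# heteroscedastic errors: the disturbance variance rises with x
y = 1.0 + 2.0 * x + rng.normal(size=n) * np.exp(0.5 * x)

# OLS of y on a constant and x
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b

# auxiliary regression: squared residuals on the suspected variance drivers
g = e ** 2
c, *_ = np.linalg.lstsq(X, g, rcond=None)
r2 = 1.0 - np.sum((g - X @ c) ** 2) / np.sum((g - g.mean()) ** 2)

lm = n * r2   # asymptotically chi-squared, here with 1 degree of freedom
print(lm)
```

Under the null of homoscedasticity the statistic is compared with a chi-squared critical value (3.84 at the 5 per cent level for one restriction); only the fitted values of the regression and the auxiliary R² are needed, which is part of the computational appeal noted in the abstract.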
On the fixed-effects vector decomposition
This paper analyses the properties of the fixed-effects vector decomposition estimator, an emerging and popular technique for estimating time-invariant variables in panel data models with unit effects. This estimator was initially motivated on heuristic grounds, and advocated on the strength of favorable Monte Carlo results, but with no formal analysis. We show that the three-stage procedure of this decomposition is equivalent to a standard instrumental variables approach, for a specific set of instruments. The instrumental variables representation facilitates the present formal analysis, which finds: (1) The estimator reproduces exactly classical fixed-effects estimates for time-varying variables. (2) The standard errors recommended for this estimator are too small for both time-varying and time-invariant variables. (3) The estimator is inconsistent when the time-invariant variables are endogenous. (4) The reported sampling properties in the original Monte Carlo evidence are incorrect. (5) We recommend an alternative shrinkage estimator that has superior risk properties to the decomposition estimator, unless the endogeneity problem is known to be small or no relevant instruments exist.
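The three-stage decomposition, and finding (1) that its final stage reproduces the classical fixed-effects estimates for the time-varying variables, can be checked numerically. A minimal sketch on synthetic panel data (the data-generating process and variable names are illustrative assumptions, not the paper's design):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 6                      # units and time periods
n = N * T
ids = np.repeat(np.arange(N), T)

a = rng.normal(size=N)            # unit effects
x = rng.normal(size=n) + a[ids]   # time-varying regressor, correlated with the effects
z = rng.normal(size=N)            # time-invariant regressor
y = 2.0 * x + 1.5 * z[ids] + a[ids] + rng.normal(size=n)

def group_mean(v):
    return np.bincount(ids, v) / T

# Stage 1: within (fixed-effects) regression of demeaned y on demeaned x
xd = x - group_mean(x)[ids]
yd = y - group_mean(y)[ids]
beta_fe = (xd @ yd) / (xd @ xd)

# recovered unit effects: u_i = ybar_i - xbar_i * beta_fe
u = group_mean(y) - group_mean(x) * beta_fe

# Stage 2: regress the unit effects on the time-invariant variable; keep the residual h
Z = np.column_stack([np.ones(N), z])
g, *_ = np.linalg.lstsq(Z, u, rcond=None)
h = u - Z @ g

# Stage 3: pooled OLS of y on a constant, x, z, and the stage-2 residual h
W = np.column_stack([np.ones(n), x, z[ids], h[ids]])
b3, *_ = np.linalg.lstsq(W, y, rcond=None)

# the stage-3 coefficient on x equals the fixed-effects estimate exactly,
# and the coefficient on h is exactly one
print(beta_fe, b3[1], b3[3])
```

The equality holds because the stage-3 residuals at the candidate solution are the within residuals, which are orthogonal to every stage-3 regressor, so no extra information about the time-varying coefficient is extracted beyond the fixed-effects fit.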
The Relationship between Expected Inflation, Disagreement, and Uncertainty: Evidence from Matched Point and Density Forecasts
More evidence on the puzzle of interindustry wage differentials: The case of West Germany