
    Instrumental Variable Estimators for Binary Outcomes

    Instrumental variables (IVs) can be used to construct estimators of exposure effects on the outcomes of studies affected by non-ignorable selection of the exposure. Estimators which fail to adjust for the effects of non-ignorable selection will be biased and inconsistent. Such situations commonly arise in observational studies, but even randomised controlled trials can be affected by non-ignorable participant non-compliance. In this paper, we review IV estimators for studies in which the outcome is binary. Recent work on identification is interpreted using an integrated structural modelling and potential outcomes framework, within which we consider the links between different approaches developed in statistics and econometrics. The implicit assumptions required for bounding causal effects and point-identification by each estimator are highlighted and compared within our framework. Finally, the implications for practice are discussed.

    Keywords: bounds, causal inference, generalized method of moments, local average treatment effects, marginal structural models, non-compliance, parameter identification, potential outcomes, structural mean models, structural models
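
    For intuition, here is a minimal sketch of the simplest estimator of this kind: the Wald (ratio) estimator of a causal risk difference with a binary instrument. The data-generating process, variable names and effect sizes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative DGP: binary instrument Z (e.g. randomised assignment),
# binary exposure X selected non-ignorably through a confounder U,
# and a binary outcome Y with a constant risk difference of 0.2.
U = rng.binomial(1, 0.5, n)                   # unobserved confounder
Z = rng.binomial(1, 0.5, n)                   # instrument
X = rng.binomial(1, 0.1 + 0.5 * Z + 0.3 * U)  # exposure (non-compliance)
Y = rng.binomial(1, 0.2 + 0.2 * X + 0.4 * U)  # binary outcome

# Wald estimator: the intention-to-treat effect of Z on Y, scaled by the
# effect of Z on X. Under the core IV conditions plus monotone selection,
# this identifies a local (complier) average risk difference.
itt = Y[Z == 1].mean() - Y[Z == 0].mean()
first_stage = X[Z == 1].mean() - X[Z == 0].mean()
print(f"Wald IV estimate: {itt / first_stage:.3f}  (true effect: 0.2)")
```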

    Instrumental Variable Estimators for Binary Outcomes

    The estimation of exposure effects on study outcomes is almost always complicated by non-random exposure selection - even randomised controlled trials can be affected by participant non-compliance. If the selection mechanism is non-ignorable then inferences based on estimators that fail to adjust for its effects will be misleading. Potentially consistent estimators of the exposure effect can be obtained if the data are expanded to include one or more instrumental variables (IVs). An IV must satisfy core conditions constraining it to be associated with the exposure, and indirectly (but not directly) associated with the outcome through this association. Here we consider IV estimators for studies in which the outcome is represented by a binary variable. While work on this problem has been carried out in statistics and econometrics, the estimators and their associated identifying assumptions have existed in the separate domains of structural models and potential outcomes with almost no overlap. In this paper, we review and integrate the work in these areas and reassess the issues of parameter identification and estimator consistency. Identification of maximum likelihood estimators comes from strong parametric modelling assumptions, with consistency depending on these assumptions being correct. Our main focus is on three semi-parametric estimators based on the generalised method of moments, marginal structural models and structural mean models (SMM). By inspecting the identifying assumptions for each method, we show that these estimators are inconsistent even if the true model generating the data is simple, and argue that this implies that consistency is obtained only under implausible conditions. Identification for SMMs can also be obtained under strong exposure-restricting design constraints that are often appropriate for randomised controlled trials, but not for observational studies. Finally, while estimation of local causal parameters is possible if the selection mechanism is monotonic, not all SMMs identify a local parameter.

    Keywords: Econometrics, Generalized method of moments, Parameter identification, Marginal structural models, Structural mean models, Structural models
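
    As a concrete instance of the estimating-equation form these semi-parametric methods take, the sketch below solves the G-estimation equation of a multiplicative (log-linear) SMM, E[(Z - E[Z]) * Y * exp(-psi * X)] = 0, on simulated data. The DGP and parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
n = 200_000

# Illustrative DGP with a confounder U, binary instrument Z and exposure X.
U = rng.binomial(1, 0.5, n)
Z = rng.binomial(1, 0.5, n)
X = rng.binomial(1, 0.1 + 0.5 * Z + 0.3 * U)
psi_true = 0.5                         # multiplicative (log risk ratio) effect
p0 = 0.1 + 0.2 * U                     # baseline risk, confounded by U
Y = rng.binomial(1, p0 * np.exp(psi_true * X))

# G-estimation: with H(psi) = Y * exp(-psi * X), the multiplicative SMM and
# the core IV conditions imply E[(Z - E[Z]) * H(psi)] = 0 at the true psi.
def moment(psi):
    return np.mean((Z - Z.mean()) * Y * np.exp(-psi * X))

psi_hat = brentq(moment, -2.0, 2.0)    # the moment is monotone in psi here
print(f"SMM estimate of psi: {psi_hat:.3f}  (true value: {psi_true})")
```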

    Price flexibility and full employment: a common misconception

    This paper highlights and builds upon Michio Morishima’s sadly neglected thesis that multi-market economies should be envisaged, and modelled, as over-determined systems, in that the number of conditions to be satisfied for equilibrium exceeds the number of unknowns (equilibrium prices and quantities) to be discovered. This understanding undermines the comfortable supposition (underpinning both New Keynesian and New Classical theoretical approaches) that, even when the economy is not in a position of full employment, a potential equilibrium solution does exist which, if not instantly, will at least eventually be achieved by market forces. In other words, contrary to the conventional view, observed price and wage stickiness should be considered as contributing to macroeconomic stability rather than inhibiting adjustment to full employment equilibrium. A further casualty of the Morishima perspective is the common textbook rationalisation that Keynesian theory applies only in the short run (with sticky prices) while the classical analysis comes into its own (with flexible prices) in the longer term.

    Keywords: Price flexibility; General equilibrium (macro) models; Walras' Law and Say's Law; Over-determined systems
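
    A minimal illustration of over-determination (a toy example, not Morishima's own model): take a single relative price p as the only unknown, subject to two independent equilibrium conditions.

```latex
\begin{aligned}
  D(p)     &= S(p)   && \text{(goods-market clearing)} \\
  L^{d}(p) &= \bar L && \text{(full employment of labour)}
\end{aligned}
```

    Two independent conditions in one unknown generically have no common solution: a flexible price can clear the goods market, but nothing guarantees that the market-clearing price also delivers full employment.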

    Fractional integration and cointegration in US financial time series data

    This paper examines several US monthly financial time series using fractional integration and cointegration techniques. The univariate analysis based on fractional integration aims to determine whether the series are I(1) (in which case markets might be efficient) or alternatively I(d) with d < 1, which implies mean reversion. The multivariate framework, exploiting recent developments in fractional cointegration, allows us to investigate in greater depth the relationships between financial series. We show that there exist many (fractionally) cointegrated bivariate relationships among the variables examined.
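
    The abstract does not specify which estimators of d are used; one standard choice is the Geweke–Porter-Hudak (GPH) log-periodogram regression, sketched below on a simulated I(d) series. The simulation and bandwidth choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an ARFIMA(0, d, 0) series via its truncated MA(infinity)
# expansion: psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k.
n, d_true = 2000, 0.3
coefs = np.cumprod(np.r_[1.0, (d_true + np.arange(n - 1)) / (1 + np.arange(n - 1))])
x = np.convolve(rng.standard_normal(2 * n), coefs, mode="full")[n:2 * n]

# GPH estimator: regress the log periodogram at the first m Fourier
# frequencies on log(4 sin^2(lambda_j / 2)); the slope estimates -d.
m = int(np.sqrt(n))                              # common bandwidth choice
lam = 2 * np.pi * np.arange(1, m + 1) / n
periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
regressor = np.log(4 * np.sin(lam / 2) ** 2)
slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
print(f"GPH estimate of d: {-slope:.3f}  (true d: {d_true}); d < 1 implies mean reversion")
```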

    Smallholders’ Cost Efficiency in Mozambique: Implications for Improved Maize Seed Adoption

    The objectives of this paper are to estimate cost efficiency and investigate the factors influencing the cost efficiency of maize-growing smallholders in Mozambique. The data used in this study came from a national random sample of 4,908 smallholder farmers conducted by the Ministry of Agriculture and Rural Development in 2002. Stochastic cost frontier and self-selection bias methods are used. The results indicate that twelve of the twenty factors examined significantly influence cost efficiency. To enhance the cost efficiency of maize production, policy makers should put more emphasis on improving rural infrastructure, providing better education, and providing access to credit.

    Keywords: Crop Production/Industries, Farm Management
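
    For reference, a minimal sketch of the first ingredient: a normal/half-normal stochastic cost frontier fitted by maximum likelihood. The specification, variable names and DGP are illustrative assumptions; the paper's model (including the self-selection correction) is richer.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 1000

# Illustrative DGP: log cost = b0 + b1 * log(output) + inefficiency + noise.
ly = rng.normal(0, 1, n)              # log output
u = np.abs(rng.normal(0, 0.3, n))     # half-normal inefficiency (u >= 0)
v = rng.normal(0, 0.2, n)             # symmetric noise
lc = 1.0 + 0.8 * ly + u + v           # log cost (frontier plus u + v)

def neg_loglik(params):
    b0, b1, log_sv, log_su = params
    sv, su = np.exp(log_sv), np.exp(log_su)
    sigma, lam = np.hypot(sv, su), su / sv
    eps = lc - b0 - b1 * ly           # composed error v + u
    # Normal/half-normal density for a COST frontier (eps = v + u), so the
    # skew term is Phi(+lam*eps/sigma) rather than Phi(-lam*eps/sigma):
    ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(lam * eps / sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.0, 0.5, -1.0, -1.0], method="Nelder-Mead")
b0, b1, log_sv, log_su = res.x
print(f"frontier: {b0:.2f} + {b1:.2f}*log(output), "
      f"sigma_v={np.exp(log_sv):.2f}, sigma_u={np.exp(log_su):.2f}")
```

    A second step would then relate the estimated inefficiencies to household characteristics to identify the determining factors.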

    Moment Inequalities in the Context of Simulated and Predicted Variables

    This paper explores the effects of simulated moments on the performance of inference methods based on moment inequalities. Commonly used confidence sets for parameters are level sets of criterion functions whose boundary points may depend on sample moments in an irregular manner. Due to this feature, simulation errors can affect the performance of inference in non-standard ways. In particular, a (first-order) bias due to the simulation errors may remain in the estimated boundary of the confidence set. We demonstrate, through Monte Carlo experiments, that simulation errors can significantly reduce the coverage probabilities of confidence sets in small samples. The size distortion is particularly severe when the number of inequality restrictions is large. These results highlight the danger of ignoring the sampling variation due to simulation errors in moment inequality models. Similar issues arise when using predicted variables in moment inequality models. We propose a method for properly correcting for these variations based on regularizing the intersection of moments in parameter space, and we show that our proposed method performs well theoretically and in practice.
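
    A toy Monte Carlo (not the paper's design, and not its regularization method) showing the mechanism: adding simulation noise to the sample moments perturbs the level-set criterion and reduces coverage of the true parameter. All numbers, including the fixed cutoff, are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def coverage(sim_noise_sd, n=200, reps=500, crit=5.99):
    """Coverage of theta0 in the interval-identified model E[L] <= theta <= E[U],
    with criterion Q = sum_j max(m_bar_j, 0)^2 and sample moments contaminated
    by additive simulation noise. crit is an illustrative fixed cutoff
    (the 0.95 chi-square(2) quantile), not a calibrated critical value."""
    theta0, hits = 0.0, 0
    for _ in range(reps):
        L = rng.normal(theta0 - 0.1, 1, n)   # lower-bound observations
        U = rng.normal(theta0 + 0.1, 1, n)   # upper-bound observations
        m1 = (L - theta0).mean() + rng.normal(0, sim_noise_sd)  # want <= 0
        m2 = (theta0 - U).mean() + rng.normal(0, sim_noise_sd)  # want <= 0
        Q = max(m1, 0) ** 2 + max(m2, 0) ** 2
        hits += n * Q <= crit                # is theta0 inside the level set?
    return hits / reps

for sd in [0.0, 0.2, 0.5]:
    print(f"simulation-noise sd={sd}: coverage of theta0 = {coverage(sd):.2f}")
```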

    From Measurement to Management: the Influence of IT on Service Operations

    The state of service management practice and developments in IT-efficiency research prompt a call for managerial relevance, normative theory building, and the conceptualization and measurement of the impact of Information Technology (IT) on service efficiency. Drawing on theoretical insights from the economic and behavioral literatures, this article deduces a work-system-centered model of the service outlet and proposes a measurement methodology (ITIMPACT) geared towards the development of a business intelligence tool. The measurement follows a two-step methodology that first assesses compared-to-best efficiency using Data Envelopment Analysis (DEA), and subsequently explains efficiency differences using a regression framework. An interdisciplinary approach grounds the first step in econometric logic, while the second draws on the behavioral sciences and information systems research.

    Keywords: Service industry
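
    A minimal sketch of such a two-step procedure: an input-oriented CCR DEA score computed by linear programming for each unit, followed by a second-stage regression of the scores on a contextual covariate. The data, the "IT usage" covariate and the CCR/OLS choices are illustrative assumptions, not the ITIMPACT specification.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)
n = 30                                           # service outlets (DMUs)
x = rng.uniform(1, 10, (n, 2))                   # inputs (e.g. staff, IT spend)
y = (x @ np.array([0.5, 0.7]) * rng.uniform(0.6, 1.0, n)).reshape(n, 1)

def ccr_efficiency(o):
    """Input-oriented CCR score for unit o: min theta subject to
    X'lam <= theta * x_o, Y'lam >= y_o, lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]                  # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-x[o], x.T]                     # inputs:  X'lam - theta*x_o <= 0
    A_out = np.c_[np.zeros(y.shape[1]), -y.T]    # outputs: -Y'lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(x.shape[1]), -y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

scores = np.array([ccr_efficiency(o) for o in range(n)])

# Step 2: explain efficiency differences with a regression on a contextual
# covariate (a made-up "IT usage" measure here), via OLS.
it_usage = rng.uniform(0, 1, n)
slope = np.polyfit(it_usage, scores, 1)[0]
print("DEA scores (first 5):", scores[:5].round(2))
print(f"second-stage slope on IT usage: {slope:.3f}")
```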

    Property Taxation of Multifamily Housing: An Empirical Analysis of Vertical and Horizontal Equity and Assessment Methods

    This study used the hedonic price technique to focus on a housing characteristic that has been studied infrequently: whether a home is site-built or manufactured. Two hedonic price regression models were used to determine the predictive power of construction type on home price. The first, which controlled for factors found to relate to home prices in previous research, showed a significant difference between the prices of the two types of homes. The second, which also included other variables through a stepwise regression, found that the type of construction had more predictive power than any other explanatory variable in the model.
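
    A minimal hedonic-regression sketch in the spirit of the first model: log price regressed on controls plus a construction-type dummy. Variables, coefficients and data are invented for illustration; the stepwise second model is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500

# Invented data: log price as a function of size, age and a
# construction-type dummy (1 = manufactured, 0 = site-built).
sqft = rng.uniform(800, 3000, n)
age = rng.uniform(0, 40, n)
manufactured = rng.binomial(1, 0.3, n)
log_price = (11 + 0.0004 * sqft - 0.005 * age
             - 0.35 * manufactured + rng.normal(0, 0.15, n))

X = sm.add_constant(np.column_stack([sqft, age, manufactured]))
fit = sm.OLS(log_price, X).fit()
# The dummy coefficient approximates the log price differential of
# manufactured homes, holding the controls fixed.
print(fit.params.round(4))   # [const, sqft, age, manufactured]
```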