    A note on the Gao et al. (2019) uniform mixture model in the case of regression

    We extend the uniform mixture model of Gao et al. (Ann Oper Res, 2019. https://doi.org/10.1007/s10479-019-03236-9) to the case of linear regression. Gao et al. proposed that, to characterize the probability distributions of multimodal and irregular data observed in engineering, a uniform mixture model can be used: a weighted combination of multiple uniform distribution components. This case is of empirical interest since, in many instances, the distribution of the error term in a linear regression model cannot be assumed unimodal. Bayesian methods of inference organized around Markov chain Monte Carlo are proposed. In a Monte Carlo experiment, significant efficiency gains are found in comparison to least squares, justifying the use of the uniform mixture model.
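
    A minimal sketch of the ingredient the abstract describes may help: a linear regression whose error term follows a weighted combination of uniform components. The weights, supports, and coefficients below are hypothetical illustration choices, and the Bayesian MCMC estimation itself is not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical uniform mixture for the regression error: weights and supports
    weights = np.array([0.5, 0.3, 0.2])     # mixing proportions (sum to one)
    lower = np.array([-2.0, 0.5, 3.0])      # lower endpoints a_j
    upper = np.array([-0.5, 2.5, 4.0])      # upper endpoints b_j

    def mixture_density(e):
        """Density of the weighted combination of uniform components."""
        e = np.atleast_1d(e)
        inside = (e[:, None] >= lower) & (e[:, None] <= upper)
        return (inside * (weights / (upper - lower))).sum(axis=1)

    def sample_errors(n):
        """Draw n errors from the uniform mixture (multimodal, irregular)."""
        comp = rng.choice(len(weights), size=n, p=weights)
        return rng.uniform(lower[comp], upper[comp])

    # Simulated linear regression whose error term cannot be assumed unimodal
    n = 500
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + sample_errors(n)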

    Convex Non-Parametric Least Squares, Causal Structures and Productivity

    In this paper we consider Convex Nonparametric Least Squares (CNLS) when productivity is introduced. In modern treatments of production function estimation, the issue has gained great importance because, when productivity shocks are known to the producers, input choices are endogenous and estimators of production function parameters become inconsistent. As CNLS has excellent properties in terms of approximating arbitrary monotone concave functions, we use it, along with flexible formulations of productivity, to estimate inefficiency and productivity growth in Chilean manufacturing plants. Inefficiency and productivity dynamics are explored in some detail, along with marginal effects of contextual variables on productivity growth, inputs, and output. Additionally, we examine the causal structure between inefficiency and productivity as well as model validity based on a causal deconfounding approach. Unlike the Cobb-Douglas and translog production functions, the CNLS system is found to admit a causal interpretation.
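
    As a rough illustration of the CNLS building block, the sketch below fits firm-specific affine hyperplanes under monotonicity and Afriat-type concavity constraints via a quadratic program, assuming a single input and simulated data; the paper's handling of productivity, endogeneity, and contextual variables is not reproduced.

    import numpy as np
    import cvxpy as cp

    # Hypothetical single-input, single-output production data
    rng = np.random.default_rng(1)
    n = 30
    x = rng.uniform(1.0, 10.0, size=n)
    y = np.sqrt(x) * np.exp(rng.normal(scale=0.1, size=n))   # concave frontier plus noise

    alpha = cp.Variable(n)   # firm-specific intercepts
    beta = cp.Variable(n)    # firm-specific slopes, kept non-negative for monotonicity
    fitted = alpha + cp.multiply(beta, x)

    constraints = [beta >= 0]
    # Concavity: each firm's own hyperplane gives the smallest fitted value at x_i
    for i in range(n):
        for j in range(n):
            if i != j:
                constraints.append(alpha[i] + beta[i] * x[i] <= alpha[j] + beta[j] * x[i])

    prob = cp.Problem(cp.Minimize(cp.sum_squares(y - fitted)), constraints)
    prob.solve()
    residuals = y - fitted.value   # basis for shifting the fit toward a frontier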

    Performance estimation when the distribution of inefficiency is unknown

    We show how to compute inefficiency or performance scores when the distribution of the one-sided error component in Stochastic Frontier Models (SFMs) is unknown, and we do the same for Data Envelopment Analysis (DEA). Our procedure, which is based on the Fast Fourier Transform (FFT), utilizes the empirical characteristic function of the residuals in SFMs or of the efficiency scores in DEA. The new techniques perform well in Monte Carlo experiments and deliver reasonable results in an empirical application to large U.S. banks. In both cases, deconvolution of DEA scores with the FFT brings the results much closer to the inefficiency estimates from the SFM.
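
    A heavily simplified sketch of the deconvolution idea follows: form the empirical characteristic function of the residuals, divide out the characteristic function of the (assumed normal) two-sided noise, and invert to recover a density for the one-sided inefficiency term. The numbers are hypothetical, plain quadrature stands in for the paper's FFT machinery, and no regularization of the ill-posed inversion is attempted.

    import numpy as np

    # Hypothetical composed residuals e = v - u from an estimated frontier:
    # v ~ Normal(0, sigma_v^2) noise, u >= 0 inefficiency with unknown distribution.
    rng = np.random.default_rng(2)
    n = 2000
    sigma_v = 0.15
    u = rng.exponential(scale=0.3, size=n)       # "unknown" one-sided component
    e = rng.normal(scale=sigma_v, size=n) - u    # residuals available to the researcher

    # Empirical characteristic function of the residuals on a truncated t-grid
    t = np.linspace(-20.0, 20.0, 512)
    ecf = np.exp(1j * np.outer(t, e)).mean(axis=1)

    # Divide out the (assumed known) normal noise CF to obtain the CF of -u
    cf_minus_u = ecf / np.exp(-0.5 * (sigma_v * t) ** 2)

    # Fourier inversion on a grid for u >= 0 (the density of u is the reflection of -u)
    x = np.linspace(0.0, 2.0, 200)
    dt = t[1] - t[0]
    dens_u = (np.exp(1j * np.outer(x, t)) * cf_minus_u).sum(axis=1).real * dt / (2 * np.pi)
    dens_u = np.clip(dens_u, 0.0, None)          # trim small negative ripples

    mean_inefficiency = float((x * dens_u).sum() / dens_u.sum())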

    Multi-criteria optimization in regression

    In this paper, we consider standard as well as instrumental variables regression. Specification problems related to autocorrelation, heteroskedasticity, neglected non-linearity, unsatisfactory out-of-sample performance, and endogeneity can be addressed in the context of multi-criteria optimization. The new technique performs well: it addresses all of these problems simultaneously and, for the most part, eliminates them. Markov Chain Monte Carlo techniques are used to perform the computations. An empirical application to NASDAQ returns is provided.
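
    One simple way to think about the approach is a scalarized multi-objective fit: several specification criteria (fit, residual autocorrelation, heteroskedasticity) are combined into one penalized objective and minimized jointly. The sketch below is an interpretation under stated assumptions, with hypothetical weights and a generic optimizer rather than the paper's Markov Chain Monte Carlo treatment, and it omits the instrumental-variables and out-of-sample criteria.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical data with autocorrelated, heteroskedastic errors
    rng = np.random.default_rng(3)
    n = 200
    x = rng.normal(size=n)
    e = np.zeros(n)
    for s in range(1, n):
        e[s] = 0.6 * e[s - 1] + rng.normal(scale=0.5 + 0.5 * abs(x[s]))
    y = 1.0 + 2.0 * x + e
    X = np.column_stack([np.ones(n), x])

    def criteria(beta):
        """Fit, squared lag-1 residual autocorrelation, and a heteroskedasticity proxy."""
        r = y - X @ beta
        sse = r @ r / n
        autocorr = (r[1:] @ r[:-1] / (r @ r)) ** 2
        het = np.corrcoef(r ** 2, X @ beta)[0, 1] ** 2
        return sse, autocorr, het

    def scalarized(beta, w=(1.0, 5.0, 5.0)):
        """Weighted sum of the criteria: one elementary multi-criteria scalarization."""
        return float(np.dot(w, criteria(beta)))

    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    beta_mc = minimize(scalarized, beta_ols, method="Nelder-Mead").x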

    COVID-19 and gradual adjustment in the tourism, hospitality, and related industries

    In this Note, we take up the problem of post-COVID-19 gradual adjustment in the industry, as well as whether and to what extent opening with limited capacity might be feasible. We find that (i) re-opening gradually while requiring only non-negative profits is quite feasible, but (ii) re-opening while requiring the same level of profit as in the pre-COVID-19 period is considerably more difficult, and seems feasible by re-opening at capacity near 33%. Lower capacities would require governmental subsidies, which could vary considerably from hotel to hotel.
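
    To make the capacity figure concrete, a break-even capacity share can be read off a linear profit function; the numbers below are purely illustrative and are not taken from the Note.

    # Illustrative break-even capacity: profit(c) = (p - v) * q_full * c - F,
    # where c is the share of capacity at which the hotel re-opens.
    p, v = 120.0, 40.0      # price and variable cost per occupied room-night (hypothetical)
    q_full = 10_000         # room-nights per period at full capacity (hypothetical)
    F = 270_000.0           # fixed cost per period (hypothetical)

    breakeven_capacity = F / ((p - v) * q_full)   # smallest c giving non-negative profit
    print(f"break-even capacity share: {breakeven_capacity:.1%}")   # 33.8% with these numbers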

    An Empirical Model of Behavioral Heterogeneity

    In this paper we propose a new latent class/mixture model (LCM) to determine whether firms behave like profit maximizers or just cost minimizers when no additional sample separation information is available. Since some firms might maximize profit while others might minimize cost, an LCM with behavioral heterogeneity can be quite useful. Estimation of the LCM amounts to mixing a cost minimization model and a profit maximization model. Using U.S. airlines data, we find that after deregulation about 15% of the airlines are consistent with profit-maximizing behavior.
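
    A minimal sketch of the latent class idea: the sample likelihood mixes two behavioral components with an unknown mixing probability and no sample separation information. Two normal regressions stand in for the profit-maximization and cost-minimization systems, and all data and starting values below are hypothetical.

    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize

    # Simulated data in which an unobserved 15% of firms follow the first regime
    rng = np.random.default_rng(4)
    n = 300
    x = rng.normal(size=n)
    regime = rng.random(n) < 0.15
    y = np.where(regime, 1.0 + 2.0 * x, 0.5 + 1.0 * x) + rng.normal(scale=0.3, size=n)

    def neg_loglik(theta):
        """Mixture log-likelihood: pi * f_profit + (1 - pi) * f_cost for each firm."""
        a1, b1, a2, b2, log_s, logit_pi = theta
        s = np.exp(log_s)
        pi = 1.0 / (1.0 + np.exp(-logit_pi))
        f1 = norm.pdf(y, a1 + b1 * x, s)   # stand-in for the profit-maximization likelihood
        f2 = norm.pdf(y, a2 + b2 * x, s)   # stand-in for the cost-minimization likelihood
        return -np.sum(np.log(pi * f1 + (1.0 - pi) * f2 + 1e-300))

    res = minimize(neg_loglik, x0=[1.0, 2.0, 0.5, 1.0, np.log(0.3), -1.0],
                   method="Nelder-Mead", options={"maxiter": 5000, "maxfev": 5000})
    share_profit_maximizers = 1.0 / (1.0 + np.exp(-res.x[-1]))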

    Bayesian local influence analysis: With an application to stochastic frontiers

    A Bayesian alternative to Zhuo (2018) is presented. The method is of general interest as it provides an explicit formula for the local sensitivity of the log marginal likelihood when observations vary by a small amount. The remarkable feature is that the formula is very easy to compute and does not require knowledge of the marginal likelihood, which is, invariably, extremely difficult to compute. Similar expressions are derived for posterior moments and other functions of interest, including inefficiency. Methods for examining prior sensitivity in a straightforward way are also presented. The methods are illustrated in the context of a stochastic production frontier.
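
    The kind of identity the abstract alludes to follows from differentiating the marginal likelihood under the integral sign, assuming conditionally independent observations; the display below is a textbook version for perturbing a single observation, not necessarily the paper's exact expression:

        \frac{\partial \log m(y)}{\partial y_i}
          = \frac{1}{m(y)} \int \frac{\partial p(y \mid \theta)}{\partial y_i}\, p(\theta)\, d\theta
          = \mathrm{E}_{\theta \mid y}\!\left[\frac{\partial \log p(y_i \mid \theta)}{\partial y_i}\right],
        \qquad m(y) = \int p(y \mid \theta)\, p(\theta)\, d\theta .

    The right-hand side is a posterior expectation, so it can be estimated from MCMC draws without ever evaluating m(y) itself, which is the computational point stressed in the abstract.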

    A generalized inefficiency model with input and output dependence

    In this paper we propose a general inefficiency model, in the sense that technical inefficiency is, simultaneously, a function of all inputs, outputs, and contextual variables. We recognize that change in inefficiency is endogenous or rational, and we propose an adjustment costs model with firm-specific but unknown adjustment cost parameters. When inefficiency depends on inputs and outputs, the firm's optimization problem changes, as the first order conditions must take into account the dependence of inefficiency on the endogenous variables of the problem. The new formulation introduces statistical challenges that are successfully resolved. The model is estimated using Maximum Simulated Likelihood, and an empirical application to U.S. banking is provided.
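
    The sketch below illustrates the Maximum Simulated Likelihood step for a deliberately simplified frontier in which inefficiency depends on a single contextual variable: the one-sided term is integrated out by averaging the noise density over simulated draws. It is a toy under stated assumptions, not the paper's general model with input- and output-dependent inefficiency and adjustment costs.

    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize

    # Toy frontier y = b0 + b1*x + v - u with v ~ N(0, s_v^2) and u >= 0 half-normal
    # whose scale depends on a contextual variable z (all values hypothetical).
    rng = np.random.default_rng(5)
    n, R = 300, 200
    x = rng.normal(size=n)
    z = rng.normal(size=n)
    u = np.abs(rng.normal(size=n)) * np.exp(0.5 * z)
    y = 1.0 + 0.8 * x + rng.normal(scale=0.2, size=n) - u

    u_base = np.abs(rng.standard_normal((n, R)))   # pseudo-random draws standing in for Halton draws

    def neg_sim_loglik(theta):
        """Average the noise density over R simulated inefficiency draws per firm."""
        b0, b1, log_sv, delta = theta
        u_draws = u_base * np.exp(delta * z)[:, None]        # simulated u for each firm and draw
        v_implied = (y - b0 - b1 * x)[:, None] + u_draws     # implied two-sided noise
        sim_density = norm.pdf(v_implied, scale=np.exp(log_sv)).mean(axis=1)
        return -np.sum(np.log(sim_density + 1e-300))

    res = minimize(neg_sim_loglik, x0=[1.0, 0.8, np.log(0.2), 0.5],
                   method="Nelder-Mead", options={"maxiter": 5000, "maxfev": 5000})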

    Robust Bayesian Inference in Stochastic Frontier Models

    We use the concept of coarsened posteriors to robustify Bayesian inference in stochastic frontier models. These posteriors arise from tempered versions of the likelihood when at most a pre-specified amount of data is used, and they are robust to changes in the model. Specifically, we examine robustness to changes in the distribution of the composed error in the stochastic frontier model (SFM). Moreover, coarsening is a form of regularization: it reduces overfitting and makes inferences less sensitive to model choice. The new techniques are illustrated using artificial data as well as in a substantive application to large U.S. banks.
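
    A minimal sketch of the tempering idea: in a coarsened (power) posterior the log-likelihood enters the Metropolis acceptance ratio multiplied by a weight below one, which downweights the data relative to the prior and dampens sensitivity to misspecification. The model, the coarsening constant, and all numbers below are hypothetical, and a plain normal working likelihood with a single location parameter stands in for the composed error of the SFM.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)
    n = 200
    # Simulated data with a skewed (frontier-like) error that the working model ignores
    y = 1.0 + rng.normal(scale=0.3, size=n) - np.abs(rng.normal(scale=0.5, size=n))

    alpha = 1.0 / (1.0 + n / 50.0)   # tempering weight; 1 / (1 + n / c) is one common coarsening choice

    def log_post(mu):
        """Coarsened posterior kernel: likelihood raised to the power alpha, times the prior."""
        loglik = norm.logpdf(y, loc=mu, scale=0.5).sum()   # misspecified working likelihood
        return alpha * loglik + norm.logpdf(mu, loc=0.0, scale=10.0)

    # Random-walk Metropolis on the single parameter mu
    draws, mu = [], 0.0
    lp = log_post(mu)
    for _ in range(5000):
        prop = mu + 0.1 * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        draws.append(mu)
    mu_post = float(np.mean(draws[1000:]))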