
    The use of supply chain DEA models in operations management: A survey

    The standard Data Envelopment Analysis (DEA) approach evaluates the efficiency of decision-making units (DMUs) while treating their internal structures as a “black box”. The aim of this paper is twofold. The first task is to survey and classify supply chain DEA models that investigate these internal structures. The second is to point out the significance of these models for the decision maker of a supply chain. We analyze the simplest case of these models, the two-stage models, as well as more general models such as network DEA models. Furthermore, we study some variations of these models, such as models with only intermediate measures between the first and second stages and models with exogenous inputs in the second stage. We define four categories: typical, relational, network and game-theoretic DEA models. We present each category along with its mathematical formulations, main applications and possible connections with the other categories. Finally, we present some concluding remarks and opportunities for future research.
    Keywords: Supply chain; Data envelopment analysis; Two-stage structures; Network structures
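    The “black box” the survey refers to is the standard single-stage DEA model. As background only (not taken from the paper; the function name and toy data are illustrative), the input-oriented CCR envelopment problem for a DMU o minimizes a contraction factor theta subject to a convex-cone combination of the other DMUs dominating it, and can be sketched with SciPy's linear-programming solver:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (n_dmus, n_inputs) input matrix; Y: (n_dmus, n_outputs) output matrix.
    Solves: min theta  s.t.  sum_j lam_j x_j <= theta * x_o,
                             sum_j lam_j y_j >= y_o,  lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision vector: [theta, lam_1, ..., lam_n]; minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Input rows: sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Output rows: -sum_j lam_j y_rj <= -y_ro  (i.e. outputs at least y_o)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Toy example: DMU 1 uses twice the input of DMU 0 for the same output.
X = np.array([[1.0], [2.0]])
Y = np.array([[1.0], [1.0]])
print(round(ccr_efficiency(X, Y, 0), 3))  # 1.0 (efficient)
print(round(ccr_efficiency(X, Y, 1), 3))  # 0.5
```

    The two-stage and network models surveyed in the paper replace this single LP with linked programs in which the outputs of one stage become the inputs of the next.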

    Benchmarking and incentive regulation of quality of service: an application to the UK electricity distribution utilities

    Quality of service has emerged as an important issue in post-reform regulation of electricity distribution networks. Regulators have employed partial incentive schemes to promote cost saving, investment efficiency, and service quality. This paper presents a quality-incorporated benchmarking study of the electricity distribution utilities in the UK between 1991/92 and 1998/99. We calculate the technical efficiency of the utilities using the Data Envelopment Analysis technique and productivity change over time using quality-incorporated Malmquist indices. We find that cost-efficient firms do not necessarily exhibit high service quality and that efficiency scores of cost-only models do not show high correlation with those of quality-based models. The results also show that improvements in service quality have made a significant contribution to the sector’s total productivity change. In addition, we show that integrating quality of service in regulatory benchmarking is preferable to cost-only approaches.
    Keywords: quality of service, benchmarking, incentive regulation, data envelopment analysis, electricity
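    As background (our notation, not the paper's), the Malmquist productivity index between periods t and t+1 is conventionally defined as the geometric mean of two distance-function ratios:

```latex
M\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right) =
\left[
\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
\cdot
\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2}
```

    Here D^t denotes the distance function measured against the period-t frontier (estimated by DEA), and values above 1 indicate productivity growth. A quality-incorporated variant of the kind used in the study augments the output vector y with service-quality indicators.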

    Reference Models and Incentive Regulation of Electricity Distribution Networks: An Evaluation of Sweden’s Network Performance Assessment Model (NPAM)

    The world-wide electricity sector reforms have led to a search for alternative and innovative approaches to regulation to promote efficiency improvement in the natural-monopoly electricity networks. A number of countries have used incentive regulation models based on efficiency benchmarking of the electricity network utilities. While most regulators have adopted parametric and non-parametric frontier-based methods of benchmarking, some have used engineering-designed ‘reference firm’ or ‘norm’ models for the purpose. This paper examines the incentive properties and other related aspects of the norm model NPAM used in the regulation of distribution networks in Sweden and compares these with those of frontier-based benchmarking methods. We identify a number of important differences between the two approaches to regulatory benchmarking that are not readily apparent and discuss their ramifications for the regulatory objectives and process.

    The Performance of German Water Utilities: A (Semi)-Parametric Analysis

    Germany's water supply industry is characterized by a multitude of utilities and widely diverging prices, possibly resulting from structural differences beyond the control of firms' management, but also from inefficiencies. In this article we use Data Envelopment Analysis and Stochastic Frontier Analysis to determine the utilities' technical efficiency scores based on cross-sectional data from 373 public and private water utilities in 2006. We find large differences in technical efficiency scores even after accounting for significant structural variables like network density, share of groundwater usage and water losses.
    Keywords: Water supply, technical efficiency, data envelopment analysis, stochastic frontier analysis, structural variables, bootstrapped truncated regression

    Efficiency and Performance in the Gas Industry

    Keywords: Efficiency, Performance, Gas Industry

    Economies of Scope in European Railways: An Efficiency Analysis

    In the course of railway reforms at the end of the last century, national European governments, as well as the EU Commission, decided to open markets and to separate railway networks from train operations. Vertically integrated railway companies – companies owning a network and providing transport services – argue that such a separation of infrastructure and operations would diminish the advantages of vertical integration and would therefore not be suitable to raise economic welfare. In this paper, we conduct a pan-European analysis to investigate the performance of European railways with a particular focus on economies of vertical integration. We test the hypothesis that integrated railways realise economies of joint production and, thus, produce railway services at a higher level of efficiency. To determine whether joint or separate production is more efficient, we apply a Data Envelopment Analysis super-efficiency bootstrapping model which relates the efficiency of integrated production to a virtual reference set consisting of the separated production technology. Our findings are that economies of scope exist in a majority of European railway companies.
    Keywords: efficiency, vertical integration, railway industry
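    The core super-efficiency idea the paper builds on, evaluating each unit against a reference set that excludes the unit itself so that efficient units can score above 1, can be sketched as follows. This is a simplified, non-bootstrapped illustration under our own assumptions; the function name and toy data are not from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, o):
    """Input-oriented super-efficiency score of DMU o.
    DMU o is removed from its own reference set, so an efficient DMU
    can obtain theta > 1 (how far it could relax inputs and stay efficient)."""
    n, m = X.shape
    s = Y.shape[1]
    keep = [j for j in range(n) if j != o]  # reference set without DMU o
    Xr, Yr = X[keep], Y[keep]
    c = np.zeros(1 + len(keep))
    c[0] = 1.0  # minimize theta
    A_in = np.hstack([-X[o].reshape(m, 1), Xr.T])      # inputs <= theta * x_o
    A_out = np.hstack([np.zeros((s, 1)), -Yr.T])       # outputs >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * len(keep))
    return res.x[0]

# Toy example: three DMUs producing one unit of output with 1, 2, 3 units of input.
X = np.array([[1.0], [2.0], [3.0]])
Y = np.array([[1.0], [1.0], [1.0]])
print(round(super_efficiency(X, Y, 0), 3))  # 2.0: DMU 0 could double its input
```

    The paper's actual model replaces the simple "all other DMUs" reference set with a virtual set built from the separated (infrastructure-only and operations-only) production technology, and adds bootstrapping to account for sampling noise.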

    COOPER-framework: A Unified Standard Process for Non-parametric Projects

    Practitioners assess the performance of entities in increasingly large and complicated datasets. Even if non-parametric models, such as Data Envelopment Analysis, were ever considered simple push-button technologies, this is impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the ‘COOPER-framework’, a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the necessary steps a researcher should examine for a well-defined and repeatable analysis. The COOPER-framework provides the novice analyst with guidance, structure and advice for a sound non-parametric analysis. The more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, the use of a standardized framework makes non-parametric assessments more reliable, more repeatable, more manageable, faster and less costly.
    Keywords: DEA, non-parametric efficiency, unified standard process, COOPER-framework
