    COOPER-framework: A Unified Standard Process for Non-parametric Projects

    Practitioners assess the performance of entities in increasingly large and complicated datasets. Even if non-parametric models, such as Data Envelopment Analysis, were ever considered simple push-button technologies, this is impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the ‘COOPER-framework’, a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the necessary steps a researcher should examine for a well-defined and repeatable analysis. The COOPER-framework provides the novice analyst with guidance, structure and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, the use of a standardized framework makes non-parametric assessments more reliable, more repeatable, more manageable, faster and less costly.
    Keywords: DEA; non-parametric efficiency; unified standard process; COOPER-framework.

    Productivity Changes and Risk Management in Indonesian Banking: An Application of a New Approach to Constructing Malmquist Indices

    In this study, we utilise a new, non-parametric efficiency measurement approach which combines the semi-oriented radial measure data envelopment analysis (SORM-SBM-DEA) approach for dealing with negative data (Emrouznejad et al., 2010) with the slacks-based efficiency measures of Tone (2001, 2002) to analyse productivity changes for Indonesian banks over the period Quarter I 2003 to Quarter II 2007. Having constructed the Malmquist indices, using data provided by Bank Indonesia (the Indonesian central bank), for the banking industry and for different bank types (i.e., listed and Islamic) and groupings, we then decomposed the industry’s Malmquist index into its technical efficiency change and frontier shift components. Finally, we analysed the banks’ risk management performance, using Simar and Wilson’s (2007) truncated regression approach, before assessing its impact on productivity growth. The first part of the Malmquist analysis showed that average productivity changes for the Indonesian banking industry tended to be driven, over the sample period, by technological progress rather than by frontier shift, although a relatively stable pattern was exhibited for most of the period. At the beginning of the period, however, state-owned and foreign banks, as well as Islamic banks, exhibited volatile productivity movements, mainly caused by shifts in the technological frontier. With respect to the risk management analysis, most of the balance sheet variables were shown to have had the expected impact on risk management efficiency. The risk management decomposition into technical efficiency change and frontier risk components demonstrated that, by the end of the sample period, the change in risk management efficiency and the risk management effects followed the same dynamic pattern, resulting in analogous dynamics for technical efficiency change. A strategy based on the gradual adoption of newer technology, with a particular focus on internal risk management enhancement, therefore seems to offer the highest potential for boosting the productivity of the financial intermediary operations of Indonesian banks.
    Keywords: Indonesian Finance and Banking; Productivity; Efficiency.
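
    For reference, the output-oriented Malmquist productivity index between periods t and t+1, together with its standard decomposition into efficiency change (catch-up) and frontier shift (technological change), can be written as below. This is the generic Färe et al. form, offered only as a sketch of the kind of decomposition the abstract refers to, not the authors’ exact specification.

```latex
% Output-oriented Malmquist index between periods t and t+1, written with
% distance functions D^t and D^{t+1} (generic Fare et al. decomposition).
\[
M_o\!\left(x^{t+1},y^{t+1},x^{t},y^{t}\right)
  = \underbrace{\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
                     {D^{t}\!\left(x^{t},y^{t}\right)}}_{\text{efficiency change (catch-up)}}
    \times
    \underbrace{\left[\frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}
                           {D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
                      \cdot
                      \frac{D^{t}\!\left(x^{t},y^{t}\right)}
                           {D^{t+1}\!\left(x^{t},y^{t}\right)}\right]^{1/2}}_{\text{frontier shift (technical change)}}
\]
```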

    Far out or alone in the crowd: Classification of self-evaluators in DEA

    The units found strongly efficient in DEA studies of efficiency can be divided into self-evaluators and active peers, depending on whether the peers are referencing any inefficient units or not. The contribution of the paper starts with subdividing the self-evaluators into interior and exterior ones. The exterior self-evaluators are efficient “by default”; there is no firm evidence from the observations for the classification. These units should therefore not be regarded as efficient, and should be removed from the set of efficiency scores when performing a two-stage analysis explaining the distribution of the scores. A method for classifying self-evaluators based on the additive DEA model is developed. The application to municipal nursing and home-care services in Norway shows significant effects of removing exterior self-evaluators from the data when performing a two-stage analysis.
    Keywords: Self-evaluator; interior and exterior self-evaluator; DEA; efficiency; referencing zone; nursing homes.
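
    The additive DEA model underlying the classification method is not spelled out in the abstract; a minimal sketch of the standard (variable-returns-to-scale) additive formulation for a unit o, in generic notation assumed by the editor rather than taken from the paper, is given below. A unit is strongly efficient precisely when all optimal slacks are zero.

```latex
% Standard additive DEA model (VRS) for unit o with m inputs and s outputs;
% strong efficiency corresponds to all optimal slacks being zero.
\[
\begin{aligned}
\max_{\lambda,\, s^{-},\, s^{+}} \quad & \sum_{i=1}^{m} s_i^{-} + \sum_{r=1}^{s} s_r^{+} \\
\text{s.t.} \quad & \sum_{j=1}^{n} \lambda_j x_{ij} + s_i^{-} = x_{io}, \qquad i = 1,\dots,m,\\
& \sum_{j=1}^{n} \lambda_j y_{rj} - s_r^{+} = y_{ro}, \qquad r = 1,\dots,s,\\
& \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j,\ s_i^{-},\ s_r^{+} \ge 0 .
\end{aligned}
\]
```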

    A New Approach to Dealing With Negative Numbers in Efficiency Analysis: An Application to the Indonesian Banking Sector

    In one of the first stand-alone studies covering the whole of the Indonesian banking industry, and utilising a unique dataset provided by the Indonesian central bank, this paper analyses the levels of intermediation-based efficiency prevailing during the period 2003-2007. Using a new approach (i.e., semi-oriented radial measure Data Envelopment Analysis, or ‘SORM DEA’) to handling negative numbers (Emrouznejad et al., 2010) and combining it with Tone’s (2001) slacks-based model (SBM) to form an input-oriented, non-parametric SORM SBM model, we first estimate the relative average efficiencies of Indonesian banks: overall, by group (as determined by their ownership structure), and by status (‘listed’/’Islamic’). For robustness, a range-directional (RD) model suggested by Silva Portela et al. (2004) was also employed to handle the negative numbers. In the second part of the analysis, we adopt Simar and Wilson’s (2007) bootstrapping methodology to formally test for the impact of size, ownership structure and status on Indonesian bank efficiency. In addition, we formally test the two models most widely suggested in the literature for controlling for bank risk – namely, those involving the inclusion of provisions for loan losses and equity capital respectively as inputs – to check the robustness of the results to the choice of risk variable. The results demonstrate a high degree of sensitivity of the average bank efficiency scores to the choice of methodology for handling negative numbers – with the RD model consistently delivering efficiency scores some 14% on average above those from the SORM SBM model – and to the choice of risk control variable under the RD model, but only a limited sensitivity to the choice of risk control variable under the SORM SBM model. With respect to group rankings, most model combinations find the ‘state-owned’ group to be the most efficient, with average overall efficiency levels ranging between 64% and 97%, while all model combinations find the ‘regional government-owned’ group to be the least efficient, with average overall efficiency levels ranging between 41% and 64%. As for the impact of bank ‘status’ on the efficiency scores, both the Islamic banks and the listed banks perform better than the industry average in the majority of model combinations. Finally, the results for the impact of scale on the efficiency scores are ambiguous. Under the RD model, and irrespective of the choice of risk control variable, size is very important in determining intermediation-based efficiency. Under the SORM SBM model, however, large banks’ performance is not significantly different from that of the medium-sized banks when equity capital is used as the risk control variable, although the medium-sized banks do out-perform small banks. Moreover, when loan loss provisions are used as the risk control variable, medium-sized banks are shown to significantly out-perform both large and small banks, with the large banks being the least efficient.
    Keywords: Indonesian Finance and Banking; Efficiency.
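
    The range-directional (RD) model of Silva Portela et al. (2004) used here as a robustness check handles negative data by measuring inefficiency along unit-specific ranges, which makes the score translation-invariant. A generic sketch of that formulation (the variable set is assumed, not the paper’s own) is:

```latex
% Range directional model (Silva Portela et al., 2004) for unit o; beta* measures
% inefficiency along the ranges R^- and R^+, and 1 - beta* can be read as efficiency.
\[
\begin{aligned}
\max_{\beta,\,\lambda} \quad & \beta \\
\text{s.t.} \quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le x_{io} - \beta R_{io}^{-}, \qquad i = 1,\dots,m,\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro} + \beta R_{ro}^{+}, \qquad r = 1,\dots,s,\\
& \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j \ge 0,
\end{aligned}
\qquad\text{with}\quad
R_{io}^{-} = x_{io} - \min_j x_{ij}, \quad
R_{ro}^{+} = \max_j y_{rj} - y_{ro}.
\]
```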

    Measuring efficiency of Tunisian schools in the presence of quasi-fixed inputs: A bootstrap data envelopment analysis approach

    The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical Data Envelopment Analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we develop a statistical model serving as the foundation of the data generating process (DGP). The DGP is constructed such that we can implement both smooth homogeneous and heterogeneous bootstrap methods. Bootstrap simulations are used to estimate and correct the bias, and to construct confidence intervals for the efficiency measures. The simulation results show that the efficiency measures are subject to sampling variation. The adjusted measure reveals that high schools with residence services would have to give up less than 12.1 percent of their resources on average to be efficient.
    Keywords: Educational economics; Efficiency; Productivity; Data Envelopment Analysis; Bootstrap; Quasi-fixed inputs.
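
    The bias correction and confidence intervals described above rest on bootstrapping DEA scores. The sketch below is a deliberately simplified Python illustration: an input-oriented, constant-returns DEA score computed by linear programming, followed by a naive resampling bootstrap for bias and percentile intervals. The function names and the resampling scheme are the editor’s assumptions; the paper itself treats quasi-fixed inputs separately and relies on smoothed homogeneous and heterogeneous bootstraps, which the naive scheme shown here does not reproduce.

```python
# Minimal sketch, NOT the authors' code: naive bootstrap around input-oriented CRS DEA.
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, x0, y0):
    """Input-oriented CRS efficiency of unit (x0, y0) against reference set (X, Y).
    X: (n, m) inputs, Y: (n, s) outputs of the n reference units."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta; vars = [theta, lambda]
    A_in = np.c_[-x0.reshape(-1, 1), X.T]            # sum_j lambda_j x_ij - theta x_i0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]            # -sum_j lambda_j y_rj <= -y_r0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -y0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

def naive_bootstrap(X, Y, B=200, seed=0):
    """Resample the reference set B times; return original scores, bias-corrected
    scores and 95% percentile intervals. Illustrative only: the smoothed bootstrap
    of Simar and Wilson is required for consistency in practice."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    theta_hat = np.array([dea_input_efficiency(X, Y, X[k], Y[k]) for k in range(n)])
    boot = np.empty((B, n))
    for b in range(B):
        idx = rng.integers(0, n, n)                  # bootstrap draw of reference units
        for k in range(n):
            boot[b, k] = dea_input_efficiency(X[idx], Y[idx], X[k], Y[k])
    bias = boot.mean(axis=0) - theta_hat
    ci = np.percentile(boot, [2.5, 97.5], axis=0)
    return theta_hat, theta_hat - bias, ci
```

    A quasi-fixed input would enter the constraints like any other input but would not be scaled by theta; that refinement, and the smoothing of the bootstrap draws, are omitted here for brevity.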

    Environmental Factors Affecting Hong Kong Banking: A Post-Asian Financial Crisis Efficiency Analysis

    Within the banking efficiency analysis literature there is a dearth of studies which have considered how banks have ‘survived’ the Asian financial crisis of the late 1990s. Considering the profound changes that have occurred in the region’s financial systems since then, such an analysis is both timely and warranted. This paper examines the evolution of the Hong Kong banking industry’s efficiency and its macroeconomic determinants through the prism of two alternative approaches to banking production, based on the intermediation and services-producing goals of bank management, over the post-crisis period. Within this research strategy we employ Tone’s (2001) Slacks-Based Model (SBM), combining it with recent bootstrapping techniques, namely the non-parametric truncated regression analysis suggested by Simar and Wilson (2007) and Simar and Zelenyuk’s (2007) group-wise heterogeneous sub-sampling approach. We find that there was a significant negative effect on Hong Kong bank efficiency in 2001, which we ascribe to the fallout from the terrorist attacks in America on 9/11 and to the completion of deposit rate deregulation that year. However, post-2001 most banks reported a steady increase in efficiency, leading to better ‘intermediation’ and ‘production’ of activities than in the base year of 2000, with the SARS epidemic having surprisingly little effect in 2003. It was also interesting to find that the smaller banks were more efficient than the larger banks, although the latter were able to enjoy economies of scale. This size factor was linked to the exportability of financial services. Other environmental factors found to be significantly impacting on bank efficiency were private consumption and housing rent.
    Keywords: Finance and Banking; Productivity; Efficiency.
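
    Tone’s (2001) slacks-based measure (SBM) referred to above scores a bank directly from its input excesses and output shortfalls rather than from a radial contraction. A generic sketch of the non-oriented, constant-returns formulation (notation assumed, not taken from the paper) is:

```latex
% Non-oriented SBM (Tone, 2001) for unit o with m inputs and s outputs;
% rho lies in (0,1] and equals 1 exactly when all optimal slacks are zero.
\[
\begin{aligned}
\min_{\lambda,\, s^{-},\, s^{+}} \quad
& \rho = \frac{1 - \dfrac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}
              {1 + \dfrac{1}{s}\sum_{r=1}^{s} s_r^{+}/y_{ro}} \\
\text{s.t.} \quad
& x_{io} = \sum_{j=1}^{n} \lambda_j x_{ij} + s_i^{-}, \qquad i = 1,\dots,m,\\
& y_{ro} = \sum_{j=1}^{n} \lambda_j y_{rj} - s_r^{+}, \qquad r = 1,\dots,s,\\
& \lambda_j,\ s_i^{-},\ s_r^{+} \ge 0 .
\end{aligned}
\]
```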