10,813 research outputs found

    A linear relational DEA model to evaluate two-stage processes with shared inputs

    Two-stage data envelopment analysis (DEA) efficiency models identify the efficient frontier of a two-stage production process. In some two-stage processes, the inputs to the first stage are also used by the second stage; these are known as shared inputs. This paper proposes a new relational linear DEA model for measuring the efficiency scores of two-stage processes with shared inputs under the constant returns-to-scale assumption. Two case studies, one from the banking industry and one from university operations, illustrate the potential applications of the proposed approach.
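    As a point of reference for how DEA efficiency scores are computed, the following is a minimal sketch of the standard single-stage, input-oriented CRS (CCR) envelopment linear program, solved with scipy.optimize.linprog; it does not reproduce the paper's two-stage shared-input formulation, and the input/output data below are hypothetical.

    import numpy as np
    from scipy.optimize import linprog

    # rows = inputs/outputs, columns = DMUs (hypothetical data)
    X = np.array([[2.0, 3.0, 4.0],
                  [5.0, 4.0, 6.0]])
    Y = np.array([[3.0, 5.0, 4.0]])

    def ccr_efficiency(o, X, Y):
        """min theta  s.t.  X @ lam <= theta * X[:, o],  Y @ lam >= Y[:, o],  lam >= 0."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                  # variables: [theta, lam_1, ..., lam_n]
        A_in = np.hstack([-X[:, [o]], X])            # X @ lam - theta * x_o <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y @ lam <= -y_o
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[:, o]],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        return res.x[0]

    for o in range(X.shape[1]):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")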

    A network Data Envelopment Analysis to estimate nations’ efficiency in the fight against SARS-CoV-2

    The ongoing outbreak of SARS-CoV-2 has been deeply impacting health systems worldwide. In this context, it is pivotal to measure the efficiency of different nations' responses to the pandemic, as the resulting insights can be used by governments and health authorities worldwide to improve their national COVID-19 strategies. Hence, we propose a network Data Envelopment Analysis (DEA) to estimate the efficiencies of fifty-five countries in the current crisis, including the thirty-seven Organisation for Economic Co-operation and Development (OECD) member countries, six OECD prospective members, four OECD key partners, and eight other countries. The network DEA model is designed as a general series structure with five single-division stages – population, contagion, triage, hospitalisation, and intensive care unit admission – and considers an output-maximisation orientation, denoting a social perspective, and an input-minimisation orientation, denoting a financial perspective. It includes inputs related to health costs, desirable and undesirable intermediate products related to the use of personal protective equipment and the infected population, respectively, and desirable and undesirable outputs regarding COVID-19 recoveries and deaths, respectively. To the best of the authors' knowledge, this is the first study proposing a cross-country efficiency measurement using a network DEA within the context of the COVID-19 crisis. The study concludes that Estonia, Iceland, Latvia, Luxembourg, the Netherlands, and New Zealand are the countries exhibiting the highest mean system efficiencies. Their national COVID-19 strategies should be studied, adapted, and used by countries exhibiting worse performances. In addition, the finding that countries with large populations present worse mean efficiency scores is statistically significant.

    Efficiency evaluation of parallel interdependent processes systems: an application to Chinese 985 Project universities

    Data envelopment analysis (DEA) has been widely applied in measuring the efficiency of homogeneous decision-making units. Network DEA, an important branch of DEA, was developed to examine the internal structure of a system, whereas traditional DEA models treat a system as a 'black box'. However, only a few previous studies on parallel systems have considered the interdependent relationships between system components. Parallel systems with interdependent processes have become common in production settings because of intense competition among organisations, so an approach to measure the efficiency of such systems is needed. This paper builds an additive DEA model to measure a parallel system with two interdependent components. The model is then applied to analyse the '985 Project' universities in China, and certain policy implications are discussed.

    Human Ecology: Industrial Ecology

    Industrial Ecology aims to inform decision making about the environmental impacts of industrial production processes by tracking and analyzing resource use and flows of industrial products, consumer products and wastes. Quantifying the patterns of use of materials and energy in different societies is one area of research in Industrial Ecology. An extensive literature is devoted in particular to Material Flow Analysis (MFA), the collection of data describing the flows of specific materials from sources to sinks within some portion of the global industrial system. Industrial Ecologists are also concerned with the system-wide environmental impacts associated with products. Design for the Environment involves the design or redesign of specific products so as to reduce their impacts, while Life Cycle Analysis (LCA) quantifies resource use and emissions per unit of product from material extraction to the eventual disposal of the product. The LCA community has created a significant body of best-practice methods and shared data, and it increasingly incorporates its analyses within input-output models of entire economies to capture the portion of the impact that would otherwise be overlooked. Input-output models, often incorporating both MFA and LCA data, analyze the effects on the environment of alternative consumption and production decisions. Industrial Ecology makes use of this array of top-down and bottom-up approaches, all of which are grounded in its origins in the ecology of the industrial system.
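    To make the input-output machinery mentioned above concrete, here is a minimal sketch of an environmentally extended input-output calculation: total sectoral output needed to satisfy a final demand is x = (I - A)^(-1) d, and multiplying by emission intensities gives the embodied emissions. The two-sector technology matrix, final demand, and emission intensities below are hypothetical.

    import numpy as np

    A = np.array([[0.10, 0.20],    # inter-industry requirements per unit of output
                  [0.30, 0.05]])
    d = np.array([100.0, 50.0])    # final demand by sector
    f = np.array([0.5, 1.2])       # emissions per unit of sectoral output

    L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse: direct + indirect requirements
    x = L @ d                          # total output required to satisfy d
    print("total output by sector:", x)
    print("total embodied emissions:", f @ x)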

    Improving energy efficiency considering reduction of CO2 emission of turnip production: A novel data envelopment analysis model with undesirable output approach

    Modern turnip production methods require a significant amount of direct and indirect energy. The optimum use of agricultural input resources increases efficiency and decreases the carbon footprint of turnip production. Data Envelopment Analysis (DEA) is a well-known technique for evaluating the efficiency of peer units relative to the best-practice frontier, and it is widely used by researchers to analyse the performance of the agricultural sector. In this regard, a new non-radial DEA-based efficiency model is designed to investigate the efficiency of turnip farms. For this purpose, five inputs and two outputs are considered. The outputs consist of turnip yield as a desirable output and greenhouse gas emissions as an undesirable output. The new model projects each DMU onto the strongly efficient frontier. Several important properties are stated and proved, demonstrating the capabilities of the proposed model. The new models are applied to evaluate 30 turnip farms in Fars, Iran; this case study demonstrates the effectiveness of the proposed models. The target inputs and outputs for these farms are also calculated, and the benchmark farm for each DMU is determined. Finally, the reduction of CO2 emission for each turnip farm is evaluated. One of the most important findings is that, compared with other factors such as human labor, diesel fuel, seed and fertilizers, machinery has the highest contribution to the total target energy saving. Besides, the average target emission of turnip production in the region is 7% less than the current emission.
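    The post-hoc emission targets reported above can be summarised with a small calculation of this kind; the per-farm current and target emissions below are hypothetical stand-ins, and the paper's non-radial projection that actually produces the targets is not reproduced here.

    import numpy as np

    current_emission = np.array([120.0, 95.0, 140.0, 110.0])   # hypothetical kg CO2-eq per farm
    target_emission  = np.array([110.0, 95.0, 125.0, 100.0])   # projections onto the efficient frontier

    reduction_pct = (current_emission - target_emission) / current_emission * 100
    print("per-farm reduction (%):", np.round(reduction_pct, 1))
    print("average reduction (%):",
          round((1 - target_emission.sum() / current_emission.sum()) * 100, 1))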

    Many-Task Computing and Blue Waters

    This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters system, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including understanding aspects of MTC applications that can be used to characterize the domain and understanding the implications of these aspects for middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, by definition MTC applications are structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications. In particular, different engineering constraints for hardware and software must be met in order to support these applications. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication-intensive or data-intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited for MTC applications. However, HPC systems often lack a dynamic resource-provisioning feature, are not ideal for task communication via the file system, and have an I/O system that is not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
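    As an illustration of the task-graph structure described above (discrete tasks with explicit input/output dependencies, dispatched as soon as their predecessors complete), here is a toy sketch of that programming model; the task names and dependency graph are hypothetical, and this is not the middleware the report discusses.

    from concurrent.futures import ThreadPoolExecutor

    # task -> set of tasks it depends on (hypothetical graph)
    deps = {
        "extract_a": set(),
        "extract_b": set(),
        "transform": {"extract_a", "extract_b"},
        "load":      {"transform"},
    }

    def run(task):
        print(f"running {task}")
        return task

    def execute(deps, workers=4):
        done, pending = set(), dict(deps)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while pending:
                # dispatch every task whose dependencies are already satisfied
                ready = [t for t, d in pending.items() if d <= done]
                futures = {pool.submit(run, t): t for t in ready}
                for fut, task in futures.items():
                    fut.result()          # wait for the task to finish
                    done.add(task)
                    del pending[task]

    execute(deps)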

    Cultural ecosystem services: stretching out the concept


    Study of fault-tolerant software technology

    Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the implications of using fault-tolerant software in real-time aerospace applications for computer architecture and design, spanning hardware, operating systems, and programming languages (including Ada). It concludes that fault-tolerant software has progressed beyond the pure research stage. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to implement software fault tolerance effectively and efficiently.
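    One classic pattern from the software fault-tolerance literature surveyed here is the recovery block: run a primary routine, check its result with an acceptance test, and fall back to alternates if the test fails or the routine raises an error. The sketch below only illustrates that general pattern, not a technique the paper prescribes, and the routines and acceptance test are hypothetical.

    def recovery_block(alternates, acceptance_test, *args):
        for routine in alternates:
            try:
                result = routine(*args)
            except Exception:
                continue                 # a raised exception counts as a failure
            if acceptance_test(result):
                return result            # first acceptable result wins
        raise RuntimeError("all alternates failed the acceptance test")

    def primary(record):
        return record["value"]           # hypothetical fast path: assumes the field exists

    def alternate(record):
        return record.get("raw", 0) * record.get("scale", 1)   # hypothetical fallback

    def accept(result):
        return isinstance(result, (int, float)) and result >= 0

    # primary raises KeyError here, so the alternate supplies the accepted result
    print(recovery_block([primary, alternate], accept, {"raw": 21, "scale": 2}))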