40 research outputs found

    σ-µ efficiency analysis: A new methodology for evaluating units through composite indices

    We propose a new methodology to employ composite indicators for performance analysis of units of interest using Stochastic Multiattribute Acceptability Analysis. We start by evaluating each unit by means of weighted sums of its elementary indicators over the whole set of admissible weights. For each unit, we compute the mean, µ, and the standard deviation, σ, of its evaluations. Clearly, the former has to be maximized, while the latter has to be minimized, as it denotes instability in the evaluations with respect to the variability of weights. We consider a unit to be Pareto-Koopmans efficient with respect to µ and σ if there is no convex combination of µ and σ of the rest of the units with a value of µ that is not smaller, and a value of σ that is not greater, with at least one strict inequality. The set of all Pareto-Koopmans efficient units constitutes the first Pareto-Koopmans frontier. By removing this set and computing the efficiency frontier for the rest of the units, one obtains the second Pareto-Koopmans frontier; the third, fourth, and subsequent Pareto-Koopmans frontiers are defined analogously. This permits assigning each unit to one of this sequence of Pareto-Koopmans frontiers. We measure the efficiency of each unit not only with respect to the first Pareto-Koopmans frontier, as in classic Data Envelopment Analysis, but also with respect to the rest of the frontiers, thus enhancing the explanatory power of the proposed approach. To illustrate its potential, we apply it to a case study of world happiness based on the data of the homonymous report, produced annually by the United Nations’ Sustainable Development Solutions Network.
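    The µ-σ computation and the frontier peeling described above can be sketched as follows. This is a minimal illustration on randomly generated units with a simple pairwise dominance check; the paper's Pareto-Koopmans definition also compares each unit against convex combinations of the other units, which this sketch omits. All data and sizes are hypothetical, and numpy is assumed available.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 8 units evaluated on 3 elementary indicators.
X = rng.random((8, 3))

# Sample admissible weight vectors (uniform on the simplex) and score
# each unit by the weighted sum of its indicators, as in SMAA.
W = rng.dirichlet(np.ones(3), size=5000)    # 5000 weight vectors
scores = X @ W.T                            # shape (units, samples)

mu = scores.mean(axis=1)      # mean evaluation: to be maximized
sigma = scores.std(axis=1)    # dispersion: to be minimized

def pareto_frontier(mu, sigma, active):
    """Indices in `active` not dominated in (max mu, min sigma)."""
    front = []
    for i in active:
        dominated = any(
            (mu[j] >= mu[i] and sigma[j] <= sigma[i]) and
            (mu[j] > mu[i] or sigma[j] < sigma[i])
            for j in active if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Peel successive frontiers: remove the first frontier, recompute the
# frontier of the remaining units, and repeat until none are left.
remaining = list(range(len(mu)))
frontiers = []
while remaining:
    f = pareto_frontier(mu, sigma, remaining)
    frontiers.append(f)
    remaining = [i for i in remaining if i not in f]
```

    Each unit ends up on exactly one frontier, which is the assignment the abstract describes; efficiency measures against the later frontiers can then be built on top of this partition.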

    Using optimization models to reach solutions in classification and clustering techniques

    Download the full text from the Universidade Estadual de Campinas institutional repository: https://hdl.handle.net/20.500.12733/1641108. This dissertation studies techniques for handling large-scale datasets to extract representative information through mathematical programming. The structural patterns of the data provide pieces of information that can be used to classify and cluster them via the optimal solutions of specific optimization problems. The techniques used can be compared with machine learning approaches, supplying new numerical possibilities for resolution. Computational tests conducted on two case studies with real data (practical experiments) validate this research. The analyses are carried out on the well-known database for the identification of breast cancer tumors, each with either a malignant or a benign diagnosis, and on a database of bovine animals containing physical and breed characteristics of each animal but with unknown patterns. For the first case study, a binary classification based on a goal programming formulation is proposed. In the study of the characteristics of the bovine animals, the interest is to identify patterns among the different animals by grouping them from the solutions of an integer linear optimization model. The computational results are examined with a set of descriptive statistical procedures to validate this research. Funding: Brasil. Universidade Estadual de Campinas. Fundação de Desenvolvimento (Funcamp).
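    The goal programming classification mentioned for the first case study can be illustrated with a standard minimize-sum-of-deviations model in the spirit of Freed and Glover. This is a generic sketch on hypothetical, linearly separable 2-D data, not the thesis's exact formulation; numpy and scipy are assumed available, and the small margin `eps` is an added assumption to rule out degenerate boundaries.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical separable data standing in for the real features.
g0 = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)   # class 0
g1 = np.array([[3, 3], [4, 3], [3, 4], [4, 4]], float)   # class 1
n0, n1, dim = len(g0), len(g1), g0.shape[1]
eps = 0.01   # margin to avoid the degenerate boundary (assumption)

# Variables z = [w (dim), b (1), d (n0+n1)]: weights, cutoff, deviations.
# Goal program: minimize sum(d) subject to
#   w.x_i <= b - eps + d_i   (class 0)
#   w.x_i >= b + eps - d_i   (class 1)
c = np.concatenate([np.zeros(dim + 1), np.ones(n0 + n1)])

A_ub = np.zeros((n0 + n1, dim + 1 + n0 + n1))
A_ub[:n0, :dim] = g0                  #  w.x_i - b - d_i <= -eps
A_ub[:n0, dim] = -1.0
A_ub[n0:, :dim] = -g1                 # -w.x_i + b - d_i <= -eps
A_ub[n0:, dim] = 1.0
A_ub[np.arange(n0 + n1), dim + 1 + np.arange(n0 + n1)] = -1.0
b_ub = np.full(n0 + n1, -eps)

# Normalization excluding the trivial solution w = 0, b = 0.
A_eq = np.zeros((1, dim + 1 + n0 + n1))
A_eq[0, :dim] = g1.mean(axis=0) - g0.mean(axis=0)
b_eq = np.array([1.0])

bounds = [(None, None)] * (dim + 1) + [(0, None)] * (n0 + n1)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")

w, b = res.x[:dim], res.x[dim]
# Classify with the midpoint of the gap between the two score ranges;
# with zero total deviation (separable data) this gap is nonempty.
thr = ((g0 @ w).max() + (g1 @ w).min()) / 2.0
labels = (np.vstack([g0, g1]) @ w > thr).astype(int)
```

    An optimal objective of zero indicates perfect linear separation; positive deviations would flag the observations the linear rule cannot place on the correct side of the cutoff.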

    σ-µ efficiency analysis: A new methodology for evaluating units through composite indices

    We propose a new methodology to employ composite indicators for performance analysis of units of interest, using and extending the family of Stochastic Multiattribute Acceptability Analysis. We start by evaluating each unit by means of weighted sums of its elementary indicators over the whole set of admissible weights. For each unit, we compute the mean, µ, and the standard deviation, σ, of its evaluations. Clearly, the former has to be maximized, while the latter has to be minimized, as it denotes instability in the evaluations with respect to the variability of weights. We consider a unit to be Pareto-Koopmans efficient with respect to µ and σ if there is no convex combination of µ and σ of the rest of the units with a value of µ that is not smaller, and a value of σ that is not greater, with at least one strict inequality. The set of all Pareto-Koopmans efficient units constitutes the first Pareto-Koopmans frontier. In the spirit of context-dependent Data Envelopment Analysis, we assign each unit to one of the sequence of Pareto-Koopmans frontiers. We measure the local efficiency of each unit with respect to each frontier, but also its global efficiency, taking into account all feasible frontiers in the

    Machine assisted quantitative seismic interpretation

    During the past decades, the size of 3D seismic data volumes and the number of seismic attributes have increased to the extent that it is difficult, if not impossible, for interpreters to examine every seismic line and time slice. Reducing the labor associated with seismic interpretation while increasing the reliability of the interpreted result has been an ongoing challenge that becomes increasingly difficult with the amount of data available to interpreters. To address this issue, geoscientists often adopt concepts and algorithms from fields such as image processing, signal processing, and statistics, with much of the focus on auto-picking and automatic seismic facies analysis. I focus my research on adapting and improving machine learning and pattern recognition methods for automatic seismic facies analysis. Being an emerging and rapidly developing topic, there is an endless list of machine learning and pattern recognition techniques available to scientific researchers. More often than not, the obstacle that prevents geoscientists from using such techniques is their “black box” nature: interpreters may not know the assumptions and limitations of a given technique, resulting in subsequent choices that may be suboptimal. In this dissertation, I provide a review of the more commonly used seismic facies analysis algorithms. My goal is to assist seismic interpreters in choosing the best method for a specific problem. Moreover, because all these methods are generic mathematical tools that solve highly abstract, analytical problems, we have to tailor them to fit seismic interpretation problems. The self-organizing map (SOM) is a popular unsupervised learning technique that interpreters use to explore seismic facies with multiple seismic attributes as input. It projects the high-dimensional seismic attribute data onto a lower-dimensional (usually 2D) space in which interpreters are able to identify clusters of seismic facies. In this dissertation, using SOM as an example, I provide three improvements on the traditional algorithm, in order to present the information residing in the seismic attributes more adequately and thereby reduce the uncertainty in the generated seismic facies map.
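    The traditional SOM workflow described above (project attribute vectors onto a small 2D grid of prototypes, then read facies clusters off the grid) can be sketched in pure numpy. The data here is a random stand-in for seismic attribute vectors, and the grid size, learning rate, and neighborhood schedule are illustrative assumptions, not the dissertation's settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for seismic attributes: 200 samples, 4 attributes.
data = rng.random((200, 4))

rows, cols, dim = 4, 4, data.shape[1]
weights = rng.random((rows * cols, dim))          # one prototype per node
grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)

n_iter = 1000
lr0, radius0 = 0.5, max(rows, cols) / 2.0

for t in range(n_iter):
    frac = t / n_iter
    lr = lr0 * (1.0 - frac)                       # decaying learning rate
    radius = radius0 * (1.0 - frac) + 0.5         # shrinking neighborhood
    x = data[rng.integers(len(data))]
    # Best matching unit: prototype closest to the sample.
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
    # Gaussian neighborhood on the 2D grid around the BMU; nearby nodes
    # are pulled toward the sample more strongly than distant ones.
    d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2.0 * radius ** 2))
    weights += lr * h[:, None] * (x - weights)

# Project each sample to its BMU; the BMU's grid coordinates give the
# 2D facies map on which interpreters look for clusters.
bmus = np.array([np.argmin(((weights - x) ** 2).sum(axis=1)) for x in data])
```

    The neighborhood update is what preserves topology: attribute vectors that are similar in the high-dimensional space land on nearby grid nodes, which is why clusters on the 2D map are interpretable as facies.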

    Conflicting Objectives in Decisions

    This book deals with quantitative approaches to decision making when conflicting objectives are present. This problem is central to many applications of decision analysis, policy analysis, operational research, and related areas in a wide range of fields, for example, business, economics, engineering, psychology, and planning. The book surveys different approaches to the same problem area, and each approach is discussed in considerable detail, so that the coverage of the book is both broad and deep. The problem of conflicting objectives is of paramount importance in both planned and market economies, and this book represents a cross-cultural mixture of approaches from many countries to the same class of problem.

    Tree Models for Interpretable Agents
