
    Min-max regret versus gross margin maximization in arable sector modeling

    The sector model presented in this article uses about 200 representative French cereal-oriented farms to estimate policy impacts by means of mathematical modeling. Usually, such models suppose that farmers intend to maximize expected gross margin. This rationality hypothesis, however, seems hardly justifiable, especially when gross margin variability due to European Common Agricultural Policy changes may become significant. Increasing uncertainty introduces bounded rationality into the decision problem, so that crop gross margins may be better approximated by interval rather than by expected (precise) values. The initial LP problem is therefore specified as an Interval Linear Programming (ILP) problem. We assume that farmers tend to decide upon their surface allocation prudently, in order to get through with minimum loss, which is precisely the rationale underlying the minimization-of-maximum-regret decision criterion. Recent advances in operations research, namely the algorithms of Mausser and Laguna, are exploited to apply the min-max regret criterion to the arable agriculture ILP. Validation against the observed crop mix showed that, as uncertainty increases, about 40% of the farmers adopt the min-max regret decision rule instead of gross margin maximization.
    Keywords: Interval Linear Programming, Min-Max Regret, Common Agricultural Policy, Arable cropping, France
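    The min-max regret criterion described in this abstract can be illustrated with a small, self-contained sketch. All numbers below are hypothetical (two crops, five candidate allocations, interval margins chosen for illustration), and the brute-force scenario enumeration is a stand-in for the algorithms of Mausser and Laguna used in the paper:

```python
from itertools import product

# Hypothetical interval gross margins (EUR/ha) for two crops.
margins = {"wheat": (300.0, 500.0), "rapeseed": (250.0, 550.0)}
land = 100.0  # total surface (ha)

# Candidate surface allocations (ha of wheat; the remainder is rapeseed).
candidates = [0.0, 25.0, 50.0, 75.0, 100.0]

# Extreme scenarios: each crop margin at its lower or upper bound.
scenarios = list(product(*[margins[c] for c in ("wheat", "rapeseed")]))

def profit(x_wheat, scenario):
    m_w, m_r = scenario
    return m_w * x_wheat + m_r * (land - x_wheat)

def max_regret(x_wheat):
    # Regret under a scenario = best achievable profit minus achieved profit.
    worst = 0.0
    for s in scenarios:
        best = max(profit(x, s) for x in candidates)
        worst = max(worst, best - profit(x_wheat, s))
    return worst

# The min-max regret allocation hedges between the two crops.
best_alloc = min(candidates, key=max_regret)
```

    In this toy case the balanced 50/50 split minimizes the maximum regret, whereas either single-crop allocation carries a much larger worst-case regret; this is the "prudent" behavior the abstract describes.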

    Positive multi-criteria models in agriculture for energy and environmental policy analysis

    Environmental consciousness and the accompanying actions have been paralleled by the evolution of multi-criteria methods, which have provided tools to assist policy makers in discovering compromises in order to muddle through. This paper recalls the development of multi-criteria methods in agriculture, focusing on their contribution to producing input or output functions useful for environmental and/or energy policy. Response curves generated by multi-criteria models can more accurately predict farmers' responses to market and policy parameters than classic profit-maximizing behavior. Concrete examples from the recent literature illustrate the above statements, and ideas for further research are provided.
    Keywords: multi-criteria models, interval programming, supply curves, bio-energy, policy analysis

    New conditions for testing necessarily/possibly efficiency of non-degenerate basic solutions based on the tolerance approach

    In this paper, a specific type of multiobjective linear programming problem with interval objective function coefficients is studied. Usually, in such problems, it is not possible to obtain an optimal solution which simultaneously optimizes all objective functions in the interval multiobjective linear programming (IMOLP) problem, requiring the selection of a compromise solution. In conventional multiobjective programming problems these compromise solutions are called efficient solutions. However, efficiency cannot be defined in a unique way in IMOLP problems. Necessary efficiency and possible efficiency have been considered as two natural extensions of efficiency to IMOLP problems. In this case, necessarily efficient solutions may not exist and the set of possibly efficient solutions usually has an infinite number of elements. Furthermore, it has been concluded that the problem of checking necessary efficiency is co-NP-complete even for the case of only one objective function. In this paper, we explore new conditions for testing the necessary/possible efficiency of non-degenerate basic solutions in IMOLP problems. We show properties of the necessarily efficient solutions in connection with possibly and necessarily optimal solutions to the related single-objective problems. Moreover, we utilize the tolerance approach and sensitivity analysis for testing necessary efficiency. Finally, based on the new conditions, a procedure to obtain some necessarily efficient and strictly possibly efficient solutions to multiobjective problems with interval objective functions is suggested.
    This research was partly supported by the Spanish Ministry of Economy and Competitiveness (project ECO2017-88883-R) and by the Fundação para a Ciência e a Tecnologia (FCT) under project grant UID/Multi/00308/2019. This work has also been partly supported by the Consejería de Innovación, Ciencia y Empresa de la Junta de Andalucía (PAI group SEJ-532). Carla Oliveira Henriques also acknowledges the training received from the University of Malaga PhD Programme in Economy and Business [Programa de Doctorado en Economía y Empresa de la Universidad de Malaga]. José Rui Figueira acknowledges the support from the FCT grant SFRH/BSAB/139892/2018 under the POCH Program and from the DOME (Discrete Optimization Methods for Energy management) FCT Research Project (Ref: PTDC/CCI-COM/31198/2017).
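    The distinction between necessary and possible optimality mentioned in the abstract can be sketched in miniature for the single-objective case. The data below are hypothetical, and the check is deliberately simplified: for a linear objective over a box of coefficients, evaluating the extreme coefficient realizations certifies necessary optimality exactly, while for possible optimality it gives only a sufficient test. This is an illustration, not the paper's conditions:

```python
from itertools import product

# max c1*x1 + c2*x2 with interval coefficients, evaluated over the vertices
# of a known feasible polytope (all data hypothetical).
intervals = [(1.0, 3.0), (2.0, 4.0)]
vertices = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0), (3.0, 2.0)]

# Extreme coefficient realizations: each coefficient at a bound.
realizations = list(product(*intervals))

def optimal_under(c):
    # Set of vertices attaining the maximum objective value under c.
    best = max(c[0] * v[0] + c[1] * v[1] for v in vertices)
    return {v for v in vertices if abs(c[0] * v[0] + c[1] * v[1] - best) < 1e-9}

# Possibly optimal at some extreme realization (a subset of all possibly
# optimal vertices); necessarily optimal = optimal under every realization.
possibly = set().union(*(optimal_under(c) for c in realizations))
necessarily = set.intersection(*(optimal_under(c) for c in realizations))
```

    Here no vertex wins under every realization, so the necessarily optimal set is empty while two different vertices are possibly optimal, mirroring the abstract's observation that necessarily efficient solutions may not exist.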

    A compromise programming approach for target setting in DEA

    This paper presents a new data envelopment analysis (DEA) target-setting approach that uses the compromise programming (CP) method of multiobjective optimization. This method computes the ideal point associated with each decision making unit (DMU) and determines an ambitious, efficient target that is as close as possible (using an lp metric) to that ideal point. The specific cases p = 1, p = 2 and p = ∞ are separately discussed and analyzed. In particular, for p = 1 and p = ∞, a lexicographic optimization approach is proposed in order to guarantee uniqueness of the obtained target. The original CP method is translation invariant and has been adapted so that the proposed CP-DEA is also units invariant. An lp metric-based efficiency score is also defined for each DMU. The proposed CP-DEA approach can also be utilized in the presence of preference information, non-discretionary or integer variables, and undesirable outputs. The proposed approach has been extensively compared with other DEA approaches on a dataset from the literature.
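    A minimal sketch of the target-selection idea, with hypothetical output data and a small pre-computed list of efficient points standing in for the DEA frontier (the actual CP-DEA model optimizes over the production possibility set rather than a finite list):

```python
# Hypothetical efficient output profiles for two outputs.
efficient = [(10.0, 2.0), (8.0, 5.0), (4.0, 8.0)]

# Ideal point: component-wise best over the efficient set.
ideal = tuple(max(v[i] for v in efficient) for i in range(2))

def lp_dist(a, b, p):
    # lp distance between two output vectors; p = float("inf") gives Chebyshev.
    diffs = [abs(x - y) for x, y in zip(a, b)]
    if p == float("inf"):
        return max(diffs)
    return sum(d ** p for d in diffs) ** (1.0 / p)

# Target for each metric: the efficient point closest to the ideal point.
targets = {p: min(efficient, key=lambda v: lp_dist(v, ideal, p))
           for p in (1, 2, float("inf"))}
```

    In this toy data the same balanced profile is selected under all three metrics; in general the choice of p changes the target, which is why the paper analyzes p = 1, p = 2 and p = ∞ separately and adds a lexicographic step to guarantee uniqueness.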

    Decision support system for multi-objective forest management: a study in the Queen Elizabeth National Forest Park in Scotland

    SIGLE record. Available from the British Library Document Supply Centre (DSC:D91694), United Kingdom.

    Sustainable Industrial Engineering along Product-Service Life Cycle/Supply Chain

    Sustainable industrial engineering addresses the sustainability issue from economic, environmental, and social points of view. Its application field is the whole value chain and life cycle of products/services, from the development to the end-of-life stages. This book aims to address many of the challenges faced by industrial organizations and supply chains in becoming more sustainable: reinventing their processes and practices; continuously incorporating sustainability guidelines and practices in their decisions, such as the circular economy and collaboration with suppliers and customers; using information technologies and systems; tracking their products' life cycle; using optimization methods to reduce resource use; and applying new management paradigms to help mitigate many of the wastes that exist across organizations and supply chains. This book will be of interest to the fast-growing body of academics studying and researching sustainability, as well as to industry managers involved in sustainability management.

    An Integrated Model for Supplier Quality Evaluation

    Thesis by Aqeel Asaad Al Salem. Supplier quality evaluation is a multi-criteria decision-making (MCDM) problem that involves multiple, heterogeneous criteria of different weights. The literature identifies quality, delivery, technology, value and service as the five most common criteria used for supplier quality evaluation. In this thesis, we consider the most important criteria for evaluating the quality of suppliers, based on a review of the literature and on observation in practice. They include both qualitative and quantitative criteria, to reflect the real attributes of the supplier in question, and are applied in a supplier quality evaluation performed on a large data set. We propose a three-stage model for performing supplier quality evaluation. In the first stage, we identify the evaluation criteria and assign a weight to each criterion, using the analytic hierarchy process (AHP) technique. In the second stage, we address the large size of supplier datasets and present a cluster-analysis-based approach to obtain manageable supplier datasets for evaluation purposes. In the third stage, we apply the VIKOR method to evaluate supplier quality within the clusters obtained from the previous stage. A numerical application is provided to demonstrate the proposed approach. The strength of the proposed model lies in the integrated application of the three techniques, in which each technique is best suited to its respective sub-problem. The model's other chief advantage is its ability to deal efficiently with the challenge of evaluating large numbers of suppliers and the data pertaining to their attributes.
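    The first stage can be sketched with a tiny AHP computation. The pairwise comparison matrix below is hypothetical, and the geometric-mean row aggregation is a standard approximation of AHP's principal-eigenvector weights, not necessarily the exact variant used in the thesis:

```python
import math

# Hypothetical 3x3 pairwise comparison matrix for three criteria
# (quality, delivery, service); entries follow Saaty's 1-9 scale,
# with A[j][i] = 1 / A[i][j].
A = [[1.0,     3.0, 5.0],
     [1 / 3.0, 1.0, 2.0],
     [1 / 5.0, 1 / 2.0, 1.0]]

# Geometric mean of each row approximates the principal eigenvector.
gm = [math.prod(row) ** (1.0 / len(row)) for row in A]

# Normalize so the criterion weights sum to one.
weights = [g / sum(gm) for g in gm]
```

    A consistency check (Saaty's consistency ratio) would normally follow before the weights feed into the clustering and VIKOR stages.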

    Optimization Models Using Fuzzy Sets and Possibility Theory

    Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But in other areas too, such as engineering design, regional policy and logistics, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp": solutions were considered to be either feasible or infeasible, either above a certain aspiration level or below it. This dichotomous structure very often forced the modeler to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, or uncertainty due to inconsistent or incomplete evidence, if natural language has to be modeled, or if state variables can only be described approximately.
    Until recently, everything which was not known with certainty, i.e. which was not known to be either true or false, or which was not known either to happen with certainty or to be impossible, was modeled by means of probabilities. This holds in particular for uncertainties concerning the occurrence of events. Probability theory was used irrespective of whether its axioms (such as, for instance, the law of large numbers) were satisfied, or whether the "events" could really be described unequivocally and crisply. In the meantime, it has become apparent that uncertainties concerning the occurrence as well as the description of events ought to be modeled in a much more differentiated way.
    New concepts and theories have been developed to do this: the theory of evidence, possibility theory and the theory of fuzzy sets have been advanced to a stage of remarkable maturity and have already been applied successfully in numerous cases and in many areas. Unfortunately, progress in these areas has been so fast in recent years that it has not been documented in a way which makes the results easily accessible and understandable for newcomers: textbooks have not been able to keep up with the speed of new developments, and edited volumes have been published which are very useful for specialists but of very little use to nonspecialists, because they assume too much background in fuzzy set theory. To a certain degree the same is true of the existing professional journals in the area of fuzzy set theory. Altogether, this volume is an important and valuable contribution to the literature on fuzzy set theory.
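    A minimal illustration of the "more-or-less" versus "yes-or-no" contrast discussed above, using a triangular membership function (the values are arbitrary):

```python
def triangular(x, a, b, c):
    """Membership degree of x in the triangular fuzzy number (a, b, c):
    0 outside [a, c], rising linearly to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# A fuzzy statement such as "approximately 50" grades values by closeness,
# instead of the crisp feasible/infeasible dichotomy.
about_50 = lambda x: triangular(x, 40.0, 50.0, 60.0)
```

    A crisp constraint would return only 0 or 1; the membership function grades intermediate values, which is exactly what fuzzy and possibilistic optimization models exploit.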