
    The treatment of uncertainty in multicriteria decision making

    The nature of human decision making dictates that a decision must often be considered under conditions of uncertainty. Decisions may be influenced by uncertain future events, doubts regarding the precision of inputs, doubts as to what the decision maker considers important, and many other forms of uncertainty. The multicriteria decision models that are designed to facilitate and aid decision making must therefore consider these uncertainties if they are to be effective. In this thesis, we consider the treatment of uncertainty in multicriteria decision making (MCDM), with a specific view to investigating the types of uncertainty that are most relevant to MCDM, and how the uncertainties identified as relevant may be treated by various MCDM methodologies.

    Simplified models for multi-criteria decision analysis under uncertainty

    When facilitating decisions in which some performance evaluations are uncertain, a decision must be taken about how this uncertainty is to be modelled. This involves, in part, choosing an uncertainty format: a way of representing the possible outcomes that may occur. It seems reasonable to suggest, and it is an aim of the thesis to show, that the choice of how uncertain quantities are represented will exert some influence over the decision-making process and the final decision taken. Many models exist for multi-criteria decision analysis (MCDA) under conditions of uncertainty; perhaps the best known are those based on multi-attribute utility theory [MAUT, e.g. 147], which uses probability distributions to represent uncertainty. The great strength of MAUT is its axiomatic foundation, but even in its simplest form its practical implementation is formidable, and although there are several practical applications of MAUT reported in the literature [e.g. 39, 270], the number is small relative to its theoretical standing. Practical applications often use simpler decision models to aid decision making under uncertainty, based on uncertainty formats that 'simplify' the full probability distributions (e.g. using expected values, variances, quantiles, etc.). The aim of this thesis is to identify decision models associated with these 'simplified' uncertainty formats and to evaluate the potential usefulness of these models as decision aids for problems involving uncertainty. It is hoped that doing so provides some guidance to practitioners about the types of models that may be used for uncertain decision making. The performance of simplified models is evaluated using three distinct methodological approaches (computer simulation, 'laboratory' choice experiments, and real-world applications of decision analysis) in the hope of providing an integrated assessment. Chapter 3 generates a number of hypothetical decision problems by simulation, and within each problem simulates the hypothetical application of MAUT and various simplified decision models. The findings allow one to assess how the simplification of MAUT models might impact results, but do not provide any general conclusions, because they are based on hypothetical decision problems and cannot evaluate practical issues, such as ease of use or the ability to generate insight, that are critical to good decision aid. Chapter 4 addresses some of these limitations by reporting an experimental study consisting of choice tasks presented to numerate but unfacilitated participants. Tasks involved subjects selecting one from a set of five alternatives with uncertain attribute evaluations, with the format used to represent uncertainty and the number of objectives for the choice varied as part of the experimental design. The study is limited by its focus on descriptive rather than real prescriptive decision making, but has implications for prescriptive practice in that it identifies natural tendencies which may need to be overcome in the course of a prescriptive analysis.
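
    To make the contrast concrete, the sketch below simulates a small decision problem and ranks alternatives both by a Monte Carlo expected-utility model and by a 'simplified' model that keeps only expected values. The distributions, weights, and utility function are illustrative assumptions, not the thesis's experimental design.

        # Minimal sketch (assumed setup): full expected-utility ranking versus a
        # ranking from a "simplified" uncertainty format (expected values only).
        import numpy as np

        rng = np.random.default_rng(0)
        n_alt, n_crit, n_draws = 5, 3, 10_000

        # Uncertain performance of each alternative on each criterion,
        # summarised by a mean and a standard deviation (both assumed).
        means = rng.uniform(0.0, 1.0, size=(n_alt, n_crit))
        sds = rng.uniform(0.05, 0.3, size=(n_alt, n_crit))
        weights = np.array([0.5, 0.3, 0.2])   # assumed criterion weights

        def utility(x, r=2.0):
            """Concave (risk-averse) single-attribute utility."""
            return 1.0 - np.exp(-r * x)

        # Full model: Monte Carlo estimate of expected additive utility.
        draws = rng.normal(means, sds, size=(n_draws, n_alt, n_crit))
        eu = (weights * utility(draws)).sum(axis=2).mean(axis=0)

        # Simplified model: the same value function applied to expected
        # values, discarding all information about spread.
        simplified = (weights * utility(means)).sum(axis=1)

        print("ranking under full model:      ", np.argsort(-eu))
        print("ranking under simplified model:", np.argsort(-simplified))

    When the two rankings disagree, the disagreement is driven entirely by the information about spread that the simplified format throws away, which is precisely the effect the thesis sets out to measure.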

    Fuelling the zero-emissions road freight of the future: routing of mobile fuellers

    The future of zero-emissions road freight is closely tied to the sufficient availability of new and clean fuel options such as electricity and hydrogen. In goods distribution using Electric Commercial Vehicles (ECVs) and Hydrogen Fuel Cell Vehicles (HFCVs), a major challenge in the transition period is their limited autonomy and the scarce, unevenly distributed refuelling stations. One viable solution to facilitate and speed up the adoption of ECVs/HFCVs by logistics companies is to bring the fuel to the point where it is needed, instead of diverting delivery vehicles to refuelling stations, using "Mobile Fuellers (MFs)". These are mobile battery-swapping/recharging vans or mobile hydrogen fuellers that can travel to a running ECV/HFCV and provide the fuel it requires to complete its delivery route, at an agreed rendezvous time and place. In this presentation, new vehicle routing models will be presented for a third-party company that provides MF services. In the proposed problem variant, the MF provider receives the routing plans of multiple customer companies and must design routes for a fleet of capacitated MFs that synchronise with the running vehicles to deliver the required amount of fuel on the fly. The presentation will discuss and compare several mathematical models based on different business models and collaborative logistics scenarios.
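
    The core coupling constraint in such models is spatio-temporal synchronisation: the fueller must reach a point on a customer vehicle's route no later than the vehicle itself. The toy feasibility check below illustrates that constraint in isolation; the coordinates, speeds, and Euclidean travel-time metric are illustrative assumptions, not the models from the presentation.

        # Toy check (assumed geometry): which stops on an ECV's planned route
        # can a mobile fueller reach before the vehicle arrives there?
        from math import hypot

        def travel_time(a, b, speed):
            return hypot(b[0] - a[0], b[1] - a[1]) / speed

        def feasible_rendezvous(route, mf_pos, mf_free_at, mf_speed):
            """route      : list of (point, vehicle_arrival_time) pairs
               mf_pos     : current position of the mobile fueller
               mf_free_at : time at which the fueller becomes available
               mf_speed   : fueller travel speed"""
            options = []
            for point, t_vehicle in route:
                t_mf = mf_free_at + travel_time(mf_pos, point, mf_speed)
                if t_mf <= t_vehicle:       # fueller waits for the vehicle
                    options.append((point, t_vehicle, t_vehicle - t_mf))
            return options  # (rendezvous point, meet time, fueller slack)

        route = [((2, 1), 10.0), ((5, 4), 25.0), ((9, 4), 45.0)]
        print(feasible_rendezvous(route, mf_pos=(0, 0), mf_free_at=5.0,
                                  mf_speed=0.5))

    The routing models then have to choose one such rendezvous per refuelling need, across all customer routes, subject to MF capacity, which is what makes the synchronised problem hard.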

    Automated Improvement of Software Architecture Models for Performance and Other Quality Attributes

    Quality attributes, such as performance or reliability, are crucial for the success of a software system and are largely influenced by the software architecture. Their quantitative prediction supports systematic, goal-oriented software design and forms the basis of an engineering approach to software design. This thesis proposes a method and tool to automatically improve component-based software architecture (CBA) models based on such quantitative quality prediction techniques.
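
    A minimal sketch of this kind of prediction-driven improvement loop appears below: candidate architectures (here, fictitious component-to-server allocations) are mutated and kept when a stand-in quality-prediction function scores them better. The model, the predictors, and the search strategy are illustrative assumptions, not the thesis's actual method or tool.

        # Sketch (assumed model): automated architecture improvement as search
        # over degrees of freedom, scored by predicted quality attributes.
        import random

        random.seed(1)
        SERVERS = [1.0, 1.5, 2.0]   # relative processing rates (assumed)
        N_COMPONENTS = 4

        def predict_response_time(alloc):
            # Stand-in for model-based performance prediction (e.g. a
            # queueing analysis): components sharing a server contend for it.
            load = {}
            for server in alloc:
                load[server] = load.get(server, 0) + 1
            return sum(load[s] / SERVERS[s] for s in alloc)

        def predict_cost(alloc):
            return sum(SERVERS[s] for s in set(alloc))  # pay per server used

        def mutate(alloc):
            a = list(alloc)
            a[random.randrange(N_COMPONENTS)] = random.randrange(len(SERVERS))
            return tuple(a)

        def score(a):  # weighted combination of two quality attributes
            return predict_response_time(a) + 0.5 * predict_cost(a)

        # Simple (1+1) evolutionary search over allocations.
        current = tuple(random.randrange(len(SERVERS))
                        for _ in range(N_COMPONENTS))
        for _ in range(200):
            cand = mutate(current)
            if score(cand) < score(current):
                current = cand
        print("best allocation:", current, "objective:",
              round(score(current), 3))

    The point of the loop is that every candidate is evaluated by prediction from the architecture model alone, so improvement happens at design time, before any system is built.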

    Noise and morphogenesis: Uncertainty, randomness and control

    This thesis presents a processual ontology of noise, by virtue of which morphogenesis (in its most general sense, the processes by which order/form is created) must be instantiated. Noise is here outlined as the far-from-equilibrium environment out of which metastable, temporary ‘solutions’ can emerge as the system transitions through the pre-individual state space. While noise is frequently addressed in humanities and arts scholarship on the basis of its supposedly disruptive character (often in aesthetic terms), this thesis aims to examine its conceptual potencies thoroughly: to explore and amplify the epistemic consequences not merely of the ineliminability of noise but also of its originative power, within the course of epistemology's elimination of the given. This philosophical work is informed by several fields of contemporary science (statistical physics, information theory, probability theory, 4E cognition, synthetic biology, nonlinear dynamics, complexity science, and computer science) in order to assess and highlight the problems in the metascientific and ideological foundations of diverse projects for the prediction and control of uncertainty, from algorithmic surveillance back to cybernetics, and to show how these rendered noise “informationally heretical”. This conveys an analysis of how contemporary prediction technologies are dramatically transforming our relationship with the future and with uncertainty across a great number of our social structures. It is a philosophico-critical anthropology of data ontology and a critique of reductive pan-info-computationalism. Additionally, two practical examples of noise as an enabling constraint for the functioning of complex adaptive systems, at once biophysical and cognitive, are presented: 1) interaction-dominance constituted by ‘pink noise’, and 2) noise as a source of variability that cells may exploit in (synthetic) biology. Finally, noise is posited as an intractable, active ontological randomness that limits the scope of determinism and goes beyond unpredictability in any epistemological sense, owing to the insuperability of the situation in which epistemology finds itself following the critique of the given.
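
    For readers unfamiliar with the ‘pink noise’ invoked in the first example, the short sketch below generates it by a standard spectral-shaping construction (white noise filtered so that power falls off as 1/f). The construction is a common one and is an assumption here, not taken from the thesis.

        # Sketch (standard construction): pink (1/f) noise by spectral shaping.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 2 ** 14
        white = rng.normal(size=n)

        spectrum = np.fft.rfft(white)
        freqs = np.fft.rfftfreq(n)
        freqs[0] = freqs[1]               # avoid division by zero at DC
        pink = np.fft.irfft(spectrum / np.sqrt(freqs), n=n)
        pink /= pink.std()                # amplitude ~ f^(-1/2), power ~ 1/f

        # Quick check: power in the lowest octave dwarfs the highest.
        p = np.abs(np.fft.rfft(pink)) ** 2
        print(p[1:9].mean() / p[-8:].mean())   # well above 1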

    A quantitative real options method for aviation technology decision-making in the presence of uncertainty

    The development of new technologies for commercial aviation involves significant risk for technologists, as these programs are often driven by fixed assumptions regarding future airline needs while being subject to many uncertainties at the technical and market levels. To prioritize these developments, technologists must assess their economic viability, even though standard methods used for capital budgeting are not well suited to handle the overwhelming uncertainty surrounding such developments. This research proposes a framework featuring real options to overcome this challenge. It is motivated by three observations: disregarding the value of managerial flexibility undervalues long-term research and development (R&D) programs; windows of opportunity emerge and disappear, and manufacturers can derive significant value by exploiting their upside potential; and integrating competitive aspects early in the design ensures that development programs are robust with respect to moves by the competition. Real options analyses have been proposed to address some of these points, but adoption has been slow, hindered by constraining frameworks. A panel of academics and practitioners has identified a set of requirements, known as the Georgetown Challenge, that real options analyses must meet to gain more traction amongst practitioners in the industry. In a bid to meet some of these requirements, this research proposes a novel methodology, cross-fertilizing techniques from financial engineering, actuarial science, and statistics, to evaluate and study the timing of technology developments under uncertainty. It aims at substantiating decision making for R&D while having a wider domain of application and an improved ability to handle a complex reality compared to more traditional approaches. The method, named FLexible AViation Investment Analysis (FLAVIA), first uses Monte Carlo techniques to simulate the evolution of the uncertainties driving the value of a technology development. A non-parametric Esscher transform is then applied to perform a change of probability measure, expressing these evolutions under the equivalent martingale measure. A bootstrap technique is suggested next to construct new non-weighted evolutions of the technology development value under the new measure. A regression-based technique is finally used to analyze the technology development program and to discover trigger boundaries which help define when the program should be launched. Verification of the method on several canonical examples indicates good accuracy and competitive execution time. The method is then applied to the analysis of a performance improvement package (PIP) development using the Integrated Cost And Revenue Estimation method (i-CARE), developed as part of this research. The PIP can be retrofitted to currently operating turbofan engines to mitigate the impact of the aging process on their operating costs, and it is subject to market uncertainties such as the evolution of jet-fuel prices and the possible taxation of carbon emissions. The profitability of the PIP development is investigated, and the value of managerial flexibility and timing flexibility is highlighted.
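
    The final, regression-based step resembles least-squares Monte Carlo in the style of Longstaff and Schwartz. The sketch below illustrates that step on an assumed geometric Brownian motion for the project value, with the Esscher-transform change of measure and the bootstrap step omitted; all dynamics, payoffs, and parameters are illustrative assumptions, not the FLAVIA method itself.

        # Sketch (assumed dynamics): regression-based timing of an option to
        # launch a development, in the spirit of Longstaff-Schwartz LSM.
        import numpy as np

        rng = np.random.default_rng(0)
        n_paths, n_steps, dt, r = 20_000, 50, 0.1, 0.04
        K = 100.0                            # assumed launch (investment) cost

        # Project-value paths under assumed risk-neutral GBM dynamics.
        z = rng.normal(size=(n_paths, n_steps))
        V = 100.0 * np.exp(np.cumsum((r - 0.5 * 0.2**2) * dt
                                     + 0.2 * np.sqrt(dt) * z, axis=1))

        payoff = np.maximum(V[:, -1] - K, 0.0)   # launch at the horizon
        for t in range(n_steps - 2, -1, -1):
            itm = V[:, t] > K                    # regress where launch pays
            X = np.vander(V[itm, t], 3)          # quadratic basis (assumed)
            beta = np.linalg.lstsq(X, payoff[itm] * np.exp(-r * dt),
                                   rcond=None)[0]
            continuation = X @ beta
            exercise = np.maximum(V[itm, t] - K, 0.0)
            launch_now = exercise > continuation  # trigger boundary in action
            payoff[itm] = np.where(launch_now, exercise,
                                   payoff[itm] * np.exp(-r * dt))
            payoff[~itm] *= np.exp(-r * dt)
        print("estimated option value of the development:",
              round(payoff.mean() * np.exp(-r * dt), 2))

    The values of V at which exercise beats the regressed continuation value trace out exactly the kind of trigger boundary the abstract describes: below it, waiting is worth more than launching.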

    No Optimisation Without Representation: A Knowledge Based Systems View of Evolutionary/Neighbourhood Search Optimisation

    In recent years, research into ‘neighbourhood search’ optimisation techniques such as simulated annealing, tabu search, and evolutionary algorithms has increased apace, resulting in a number of useful heuristic solution procedures for real-world and research combinatorial and function optimisation problems. Unfortunately, their selection and design remain a somewhat ad hoc procedure and very much an art. Needless to say, this shortcoming presents real difficulties for the future development and deployment of these methods. This thesis presents work aimed at resolving this issue of principled optimiser design. Driven by the needs of both the end-user and the designer, and by their knowledge of the problem domain and of the search dynamics of these techniques, a semi-formal, structured design methodology that makes full use of the available knowledge will be proposed, justified, and evaluated. This methodology is centred on a Knowledge Based System (KBS) view of neighbourhood search with a number of well-defined knowledge sources that relate to specific hypotheses about the problem domain. This viewpoint is complemented by a number of design heuristics that suggest a structured series of hillclimbing experiments which allow these results to be empirically evaluated and then transferred to other optimisation techniques if desired. First of all, this thesis reviews the techniques under consideration. The case for the exploitation of problem-specific knowledge in optimiser design is then made. Optimiser knowledge is shown to derive from either the problem domain theory or the optimiser search dynamics theory. From this, it will be argued that the design process should be primarily driven by the problem domain theory knowledge, as this makes best use of the available knowledge and results in a system whose behaviour is more likely to be justifiable to the end-user. The encoding and neighbourhood operators are shown to embody the main source of problem domain knowledge, and it will be shown how forma analysis can be used to formalise the hypotheses about the problem domain that they represent. Therefore it should be possible for the designer to experimentally evaluate hypotheses about the problem domain. To this end, design heuristics that allow the transfer of results across optimisers based on a common hillclimbing class, and that can be used to inform the choice of evolutionary algorithm recombination operators, will be proposed and justified. In fact, the above approach bears some similarity to that of KBS design. Additional knowledge sources and roles will therefore be described and discussed, and it will be shown how forma analysis again plays a key part in their formalisation. Design heuristics for many of these knowledge sources will then be proposed and justified. This methodology will be evaluated by testing the validity of the proposed design heuristics in the context of two sequencing case studies. The first case study is a well-studied problem from operational research, the flowshop sequencing problem, which provides a thorough test of many of the design heuristics proposed here. Also, an idle-time move preference heuristic will be proposed and demonstrated on both directed mutation and candidate list methods.
    The second case study applies the above methodology to design a prototype system for resource redistribution in the developing world, a problem that can be modelled as a very large transportation problem with non-linear constraints and objective function. The system, combining neighbourhood search with a constructive algorithm that reformulates the problem as one of sequencing, was able to produce feasible shipment plans, for problems derived from data from the World Health Organisation’s TB programme in China, that are much larger than those tackled by the current ‘state of the art’ for transportation problems.
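
    The sketch below illustrates the central experimental device just described: a hillclimber whose problem knowledge lives entirely in the encoding (a permutation) and in the pluggable neighbourhood operator, so that comparing operators on the same hillclimbing class amounts to testing rival hypotheses about the problem domain. The toy flowshop instance and the two operators are illustrative assumptions, not the thesis's experiments.

        # Sketch (assumed instance): one hillclimbing class, two swappable
        # neighbourhood operators, evaluated on a toy permutation flowshop.
        import random

        def makespan(seq, proc):
            """Completion time of the last job on the last machine."""
            n_machines = len(proc[0])
            done = [0.0] * n_machines
            for job in seq:
                for m in range(n_machines):
                    # C[j][m] = max(C[j-1][m], C[j][m-1]) + p[j][m]
                    done[m] = max(done[m], done[m - 1] if m else 0.0) \
                        + proc[job][m]
            return done[-1]

        def swap_move(seq):          # hypothesis: good orders differ by swaps
            i, j = random.sample(range(len(seq)), 2)
            s = list(seq); s[i], s[j] = s[j], s[i]
            return s

        def shift_move(seq):         # hypothesis: jobs need repositioning
            i, j = random.sample(range(len(seq)), 2)
            s = list(seq); s.insert(j, s.pop(i))
            return s

        def hillclimb(move, proc, iters=2000):
            seq = list(range(len(proc)))
            random.shuffle(seq)
            best = makespan(seq, proc)
            for _ in range(iters):
                cand = move(seq)
                c = makespan(cand, proc)
                if c <= best:        # accept sideways moves on plateaus
                    seq, best = cand, c
            return best

        random.seed(0)
        proc = [[random.randint(1, 9) for _ in range(3)]
                for _ in range(8)]   # 8 jobs, 3 machines (assumed)
        print("swap :", hillclimb(swap_move, proc))
        print("shift:", hillclimb(shift_move, proc))

    Whichever operator wins here says something about the problem's structure, and, per the design heuristics, that finding can then be carried over to more elaborate optimisers built on the same hillclimbing class.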

    The Routledge Handbook of Philosophy of Economics

    The most fundamental questions of economics are often philosophical in nature, and philosophers have, since the very beginning of Western philosophy, asked many questions that current observers would identify as economic. The Routledge Handbook of Philosophy of Economics is an outstanding reference source for the key topics, problems, and debates at the intersection of philosophical and economic inquiry. It captures this field of countless exciting interconnections, affinities, and opportunities for cross-fertilization. Comprising 35 chapters by a diverse team of contributors from all over the globe, the Handbook is divided into eight sections: I. Rationality; II. Cooperation and Interaction; III. Methodology; IV. Values; V. Causality and Explanation; VI. Experimentation and Simulation; VII. Evidence; VIII. Policy. The volume is essential reading for students and researchers in economics and philosophy who are interested in exploring the interconnections between the two disciplines. It is also a valuable resource for those in related fields such as political science, sociology, and the humanities.