3,291 research outputs found

    Decision-Making with Belief Functions: a Review

    Approaches to decision-making under uncertainty in the belief function framework are reviewed. Most methods are shown to blend criteria for decision under ignorance with the maximum expected utility principle of Bayesian decision theory. A distinction is made between methods that construct a complete preference relation among acts and those that allow incomparability of some acts due to lack of information. Methods developed in the imprecise probability framework are applicable in the Dempster-Shafer context and are also reviewed. Shafer's constructive decision theory, which substitutes the notion of goal for that of utility, is described and contrasted with other approaches. The paper ends by pointing out the need for deeper investigation of fundamental issues related to decision-making with belief functions and for an assessment of the descriptive, normative and prescriptive values of the different approaches.
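    The lower and upper expected utilities that underlie many of the criteria this abstract reviews can be sketched in a few lines; the mass function and utilities below are invented for illustration, not taken from the paper.

```python
# Lower/upper expected utility of an act under a Dempster-Shafer mass function:
# the lower expectation takes the worst state in each focal set, the upper
# expectation the best. (Illustrative numbers only.)

def lower_upper_expected_utility(mass, utility):
    """mass: {frozenset of states: mass value}; utility: {state: float}."""
    lower = sum(m * min(utility[s] for s in focal) for focal, m in mass.items())
    upper = sum(m * max(utility[s] for s in focal) for focal, m in mass.items())
    return lower, upper

# Half the mass is precise on state s1, half expresses ignorance over {s1, s2}.
mass = {frozenset({"s1"}): 0.5, frozenset({"s1", "s2"}): 0.5}
utility = {"s1": 10.0, "s2": 0.0}
lo, up = lower_upper_expected_utility(mass, utility)  # lo = 5.0, up = 10.0
```

    A complete ranking of acts can then be obtained, for instance, by a Hurwicz-style mixture of `lo` and `up`, while incomparability-based methods keep the whole interval.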

    Decision Making under Complex Uncertainty


    Concepts for Decision Making under Severe Uncertainty with Partial Ordinal and Partial Cardinal Preferences

    We introduce three different approaches for decision making under uncertainty when (I) there is only partial (both cardinally and ordinally scaled) information on an agent's preferences and (II) the uncertainty about the states of nature is described by a credal set (or some other imprecise probabilistic model). In particular, situation (I) is modeled by a pair of binary relations, one specifying the partial rank order of the alternatives and the other modeling partial information on the strength of preference. Our first approach relies on decision criteria that construct complete rankings of the available acts based on generalized expectation intervals. Subsequently, we introduce different concepts of global admissibility that construct partial orders between the available acts by comparing them all simultaneously. Finally, we define criteria that are induced by suitable binary relations on the set of acts and can therefore be understood as concepts of local admissibility. For certain criteria, we provide algorithms based on linear programming for checking the optimality/admissibility of acts. Additionally, the paper includes a discussion of a prototypical situation by means of a toy example.
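    The generalized expectation intervals and admissibility-style comparisons described above can be illustrated with a small sketch; the credal set (given by two invented extreme points) and the two acts are hypothetical.

```python
# Expectation interval of an act over a credal set given by finitely many
# extreme probability vectors, plus interval dominance: act a dominates act b
# when a's lower expectation exceeds b's upper expectation. (Toy numbers.)

def expectation_interval(act_utils, extreme_points):
    exps = [sum(p * u for p, u in zip(pt, act_utils)) for pt in extreme_points]
    return min(exps), max(exps)

def interval_dominates(act_a, act_b, extreme_points):
    lower_a, _ = expectation_interval(act_a, extreme_points)
    _, upper_b = expectation_interval(act_b, extreme_points)
    return lower_a > upper_b

credal = [[0.3, 0.7], [0.6, 0.4]]      # two extreme probability vectors
risky, safe = [10.0, 2.0], [4.0, 4.0]  # utilities of two acts in two states
result = interval_dominates(risky, safe, credal)  # intervals [4.4, 6.8] vs [4.0, 4.0]
```

    Interval dominance typically yields only a partial order: when two expectation intervals overlap, the acts stay incomparable, matching the distinction drawn in the abstract between complete rankings and admissibility concepts.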

    Imprecise Markov Models for Scalable and Robust Performance Evaluation of Flexi-Grid Spectrum Allocation Policies

    The possibility of flexibly assigning spectrum resources with channels of different sizes greatly improves the spectral efficiency of optical networks, but can also lead to unwanted spectrum fragmentation. We study this problem in a scenario where traffic demands are categorised into two types (low or high bit-rate) by assessing the performance of three allocation policies. Our first contribution consists of exact Markov chain models for these allocation policies, which allow us to numerically compute the relevant performance measures. However, these exact models do not scale to large systems, in the sense that the computations required to determine the blocking probabilities---which measure the performance of the allocation policies---become intractable. To address this, we first extend an approximate reduced-state Markov chain model from the literature to the three considered allocation policies. These reduced-state models allow us to tractably compute approximations of the blocking probabilities, but the accuracy of these approximations cannot be easily verified. Our main contribution, then, is the introduction of reduced-state imprecise Markov chain models that allow us to derive guaranteed lower and upper bounds on blocking probabilities, for the three allocation policies separately or for all possible allocation policies simultaneously.
    Comment: 16 pages, 7 figures, 3 tables
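    The idea of turning model imprecision into guaranteed bounds can be conveyed with a far simpler loss model than the ones in the paper. The sketch below is a hypothetical stand-in using the classic Erlang-B formula with invented numbers; it exploits the fact that Erlang-B blocking is increasing in the offered load, so an interval of loads yields bounds on the blocking probability.

```python
# Erlang-B blocking probability of an M/M/c/c loss system, computed with the
# standard stable recursion. Since blocking increases with the offered load,
# an interval of loads yields guaranteed lower/upper bounds on blocking.
# (Toy stand-in; the paper's models track much richer spectrum state.)

def erlang_b(servers, offered_load):
    b = 1.0  # blocking probability with 0 servers
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Offered load known only to lie in [0.5, 1.0] Erlangs; 2 channels.
lower_block = erlang_b(2, 0.5)  # guaranteed lower bound, about 0.077
upper_block = erlang_b(2, 1.0)  # guaranteed upper bound, 0.2
```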

    Qualitative Answering Surveys and Soft Computing

    In this work, we reflect on some questions about the measurement problem in economics and, especially, on its relationship with the scientific method. Statistical sources frequently used by economists contain qualitative information obtained from individuals' verbal expressions by means of surveys, and we discuss the reasons why such information would be more adequately analyzed with soft methods than with traditional ones. Some comments on the techniques most commonly applied to the analysis of these types of data with verbal answers are followed by our proposal to compute with words. In our view, an alternative use of the well-known Income Evaluation Question seems especially suggestive for a computing-with-words approach, since it would facilitate an empirical estimation of the adjectives of the corresponding linguistic variable. A new treatment of the information contained in such surveys would avoid some questions incorporated in the so-called Leyden approach that do not fit the real world.
    Keywords: computing with words, Leyden approach, qualitative answering surveys, fuzzy logic
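    The computing-with-words idea can be sketched with a fuzzy linguistic variable; the labels, breakpoints and income figures below are invented for illustration and are not estimates from any survey.

```python
# A verbal answer scale modelled as triangular fuzzy sets over monthly income.
# Each linguistic label gets a membership function; a given income then belongs
# to several labels to different degrees. (Hypothetical breakpoints.)

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set with support (a, c) and peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degrees to which a 2500-euro income counts as "insufficient" vs "sufficient".
insufficient = triangular(2500, 0, 2000, 4000)     # 0.75
sufficient   = triangular(2500, 2000, 4000, 6000)  # 0.25
```

    Empirically estimating such membership functions from verbal survey answers is precisely what a computing-with-words treatment of the Income Evaluation Question would aim at.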

    Severity-sensitive norm-governed multi-agent planning

    This research was funded by Selex ES. The software developed during this research, including the norm analysis and planning algorithms, the simulator and the harbour protection scenario used during evaluation, is freely available from doi:10.5258/SOTON/D0139. Peer reviewed. Publisher PDF.

    Some contributions to decision making in complex information settings with imprecise probabilities and incomplete preferences

