
    First-Fit is Linear on Posets Excluding Two Long Incomparable Chains

    A poset is (r + s)-free if it does not contain two incomparable chains of size r and s, respectively. We prove that when r and s are at least 2, the First-Fit algorithm partitions every (r + s)-free poset P into at most 8(r-1)(s-1)w chains, where w is the width of P. This solves an open problem of Bosek, Krawczyk, and Szczypka (SIAM J. Discrete Math., 23(4):1992–1999, 2010).
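
    The bound concerns the on-line First-Fit chain partitioning algorithm. As a minimal sketch (not taken from the paper), assuming elements arrive in some presentation order and comparability is available as a predicate `leq`, First-Fit places each element on the first existing chain it is comparable with:

    ```python
    def first_fit_chains(elements, leq):
        """Greedy First-Fit chain partition of a poset.

        elements: iterable of poset elements in presentation order.
        leq(a, b): True iff a <= b in the partial order (assumed reflexive).
        Returns a list of chains (lists of mutually comparable elements).
        """
        chains = []
        for x in elements:
            for chain in chains:
                # x joins the first chain in which it is comparable to every member.
                if all(leq(x, y) or leq(y, x) for y in chain):
                    chain.append(x)
                    break
            else:
                chains.append([x])  # no existing chain fits; open a new one
        return chains

    # Example: divisibility order on {1,...,10}; the width is 5 (antichain {6,7,8,9,10}).
    print(len(first_fit_chains(range(1, 11), lambda a, b: b % a == 0)))  # 5
    ```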

    A Characterization of Mixed Unit Interval Graphs

    We give a complete characterization of mixed unit interval graphs, the intersection graphs of closed, open, and half-open unit intervals of the real line. This is a proper superclass of the well-known unit interval graphs. Our result solves a problem posed by Dourado, Le, Protti, Rautenbach and Szwarcfiter (Mixed unit interval graphs, Discrete Math. 312, 3357-3363 (2012)).
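
    For intuition, here is a small sketch (the encoding is ours, not the paper's) of how the intersection graph of mixed unit intervals could be built, representing each interval by its left endpoint plus two flags marking which endpoints are closed:

    ```python
    from itertools import combinations

    def intersects(i, j):
        """i, j = (left, left_closed, right_closed) describe a unit interval
        with endpoints left and left+1, each of which may be open or closed."""
        (a, alc, arc), (b, blc, brc) = sorted([i, j])  # ensure a <= b
        if b < a + 1:
            return True         # proper overlap of positive length
        if b == a + 1:
            return arc and blc  # they share only the point a+1
        return False

    def mixed_unit_interval_graph(intervals):
        """Return the edge set of the intersection graph on interval indices."""
        return {(u, v) for (u, iu), (v, iv) in combinations(enumerate(intervals), 2)
                if intersects(iu, iv)}

    # Example: a closed, an open, and a half-open unit interval.
    print(mixed_unit_interval_graph([(0.0, True, True), (1.0, False, True), (0.5, True, False)]))
    ```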

    Recognizing Members of the Tournament Equilibrium Set is NP-hard

    A recurring theme in the mathematical social sciences is how to select the "most desirable" elements given a binary dominance relation on a set of alternatives. Schwartz's tournament equilibrium set (TEQ) ranks among the most intriguing, but also among the most enigmatic, tournament solutions that have been proposed so far in this context. Due to its unwieldy recursive definition, little is known about TEQ. In particular, its monotonicity remains an open problem to this day. Yet, if TEQ were to satisfy monotonicity, it would be a very attractive tournament solution concept refining both the Banks set and Dutta's minimal covering set. We show that the problem of deciding whether a given alternative is contained in TEQ is NP-hard.
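
    For concreteness, a naive sketch of the recursive definition (our reading of Schwartz's definition, not code from the paper): an alternative TEQ-dominates b if it lies in the TEQ of the subtournament on b's dominators, and TEQ is the top cycle of that relation. The sketch runs in exponential time in the worst case, in line with the hardness result above.

    ```python
    def teq(alternatives, beats):
        """Naive recursive computation of the tournament equilibrium set (TEQ).

        alternatives: frozenset of alternatives; beats(a, b): True iff a dominates b.
        """
        if len(alternatives) <= 1:
            return set(alternatives)
        # Build the TEQ-dominance relation on this subtournament.
        edges = {a: set() for a in alternatives}
        for b in alternatives:
            dominators = frozenset(a for a in alternatives if beats(a, b))
            for a in teq(dominators, beats):
                edges[a].add(b)
        # Top cycle: alternatives reached only by alternatives they reach back
        # (maximal elements of the reachability preorder of the relation).
        def reaches(src):
            seen, stack = {src}, [src]
            while stack:
                for nxt in edges[stack.pop()]:
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return seen
        reach = {a: reaches(a) for a in alternatives}
        return {a for a in alternatives
                if all(b in reach[a] for b in alternatives if a in reach[b])}

    # Example: a 3-cycle, for which TEQ is the whole set.
    cycle = {(0, 1), (1, 2), (2, 0)}
    print(teq(frozenset({0, 1, 2}), lambda a, b: (a, b) in cycle))  # {0, 1, 2}
    ```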

    Mean-risk models using two risk measures: A multi-objective approach

    This paper proposes a model for portfolio optimisation in which distributions are characterised and compared on the basis of three statistics: the expected value, the variance and the CVaR at a specified confidence level. The problem is multi-objective and is transformed into a single-objective problem in which variance is minimised while constraints are imposed on the expected value and CVaR. In the case of discrete random variables, the problem is a quadratic program. The mean-variance (mean-CVaR) efficient solutions that are not dominated with respect to CVaR (variance) are particular efficient solutions of the proposed model. In addition, the model has efficient solutions that are discarded by both mean-variance and mean-CVaR models, although they may improve the return distribution. The model is tested on real data drawn from the FTSE 100 index. An analysis of the return distribution of the chosen portfolios is presented.
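
    In the discrete-scenario case, one standard way to write such a single-objective problem uses the Rockafellar–Uryasev representation of CVaR; the notation below (portfolio weights x, scenario returns r_s, mean returns μ, covariance Σ, target return d, CVaR bound c at level α) is ours and may differ from the paper's exact formulation:

    ```latex
    \begin{align*}
    \min_{x,\,\eta}\quad & x^{\top}\Sigma x && \text{(portfolio variance)}\\
    \text{s.t.}\quad & \mu^{\top}x \ge d && \text{(expected return at least } d\text{)}\\
     & \eta + \frac{1}{(1-\alpha)S}\sum_{s=1}^{S}\bigl[-r_{s}^{\top}x-\eta\bigr]_{+} \le c
       && \text{(CVaR of the loss at level } \alpha \text{ at most } c\text{)}\\
     & \mathbf{1}^{\top}x = 1,\quad x \ge 0 .
    \end{align*}
    ```

    Replacing the positive-part terms by auxiliary variables makes the CVaR constraint linear, so the whole problem is a quadratic program, consistent with the abstract's claim for discrete random variables.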

    Behavioral implications of shortlisting procedures

    We consider two-stage “shortlisting procedures” in which the menu of alternatives is first pruned by some process or criterion and then a binary relation is maximized. Given a particular first-stage process, our main result supplies a necessary and sufficient condition for choice data to be consistent with a procedure in the designated class. This result applies to any class of procedures with a certain lattice structure, including the cases of “consideration filters,” “satisficing with salience effects,” and “rational shortlist methods.” The theory avoids background assumptions made for mathematical convenience, in this and other respects following Richter’s classical analysis of preference-maximizing choice in the absence of shortlisting.
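
    As an illustrative sketch (the particular filter and relation below are invented examples, not the paper's), a two-stage shortlisting procedure can be phrased as: prune the menu, then keep the undominated elements of the shortlist.

    ```python
    def shortlist_choice(menu, consider, prefers):
        """Two-stage shortlisting: prune the menu, then maximize a binary relation.

        consider(menu): returns the shortlist (a subset of menu), e.g. a consideration filter.
        prefers(a, b): True iff a is strictly preferred to b (need not be complete).
        Returns the maximal elements of the shortlist.
        """
        shortlist = consider(menu)
        return {a for a in shortlist if not any(prefers(b, a) for b in shortlist)}

    # Example: the filter keeps the two largest items, then smaller numbers are preferred.
    top_two = lambda menu: set(sorted(menu)[-2:])
    print(shortlist_choice({1, 3, 5, 9}, top_two, lambda a, b: a < b))  # {5}
    ```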

    Divergent mathematical treatments in utility theory

    In this paper I study how divergent mathematical treatments affect mathematical modelling, with a special focus on utility theory. In particular, I examine recent work on the ranking of information states and the discounting of future utilities, in order to show how, by replacing the standard analytical treatment of the models involved with one based on the framework of Nonstandard Analysis, diametrically opposite results are obtained. In both cases, the choice between the standard and nonstandard treatment amounts to a selection of set-theoretical parameters that cannot be made on purely empirical grounds. The analysis of this phenomenon gives rise to a simple logical account of the relativity of impossibility theorems in economic theory, which concludes the paper.

    Topological aggregation, the twin paradox and the No Show paradox

    Consider the framework of topological aggregation introduced by Chichilnisky (1980). We prove that in this framework the Twin Paradox and the No Show Paradox cannot be avoided. Anonymity and unanimity are not needed to obtain these results.

    Introducing the composite time trade-off: a test of feasibility and face validity

    Introduction: This study was designed to test the feasibility and face validity of the composite time trade-off (composite TTO), a new approach to TTO allowing for a more consistent elicitation of negative health state values. Methods: The new instrument combines a conventional TTO to elicit values for states regarded better than dead and a lead-time TTO for states worse than dead. Results: A total of 121 participants completed the composite TTO for ten EQ-5D-5L health states. Mean values ranged from −0.104 for health state 53555 to 0.946 for 21111. The instructions were clear to 98% of the respondents, and 95% found the task easy to understand, indicating feasibility. Further, the average number of steps taken in the iteration procedure to achieve the point of indifference in the TTO and the average duration of each task were indicative of a deliberate cognitive process. Conclusion: Face validity was confirmed by the high mean values for the mild health states (>0.90) and low mean values for the severe states (<0.42). In conclusion, this study demonstrates the feasibility and face validity of the composite TTO in a face-to-face standardized computer-assisted interview setting.
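
    As a rough illustration of how composite TTO responses are typically scored (the 10-year life span and 10-year lead time below are assumptions based on the common composite TTO set-up, not figures taken from this study):

    ```python
    def composite_tto_value(t_full_health, better_than_dead, life_span=10, lead_time=10):
        """Illustrative scoring of a composite TTO response (parameters assumed).

        Better than dead:  conventional TTO, value = x / life_span.
        Worse than dead:   lead-time TTO,    value = (x - lead_time) / life_span,
        where x = t_full_health is the full-health time at the point of indifference.
        """
        if better_than_dead:
            return t_full_health / life_span
        return (t_full_health - lead_time) / life_span

    print(composite_tto_value(9.5, True))   #  0.95, a mild state
    print(composite_tto_value(9.0, False))  # -0.10, a state valued worse than dead
    ```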

    Reasons and Means to Model Preferences as Incomplete

    Literature involving preferences of artificial agents or human beings often assumes that these preferences can be represented by a complete transitive binary relation. Much has been written, however, on different models of preferences. We review some of the reasons that have been put forward to justify more complex modelling, and some of the techniques that have been proposed to obtain models of such preferences.
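
    One simple way to represent an incomplete preference is as a strict relation in which unlisted pairs are incomparable, with choice picking the undominated alternatives; the example below is purely illustrative and not taken from the paper.

    ```python
    class PartialPreference:
        """An incomplete strict preference: only the listed pairs are comparable."""
        def __init__(self, strictly_prefers):
            self.p = set(strictly_prefers)  # (a, b) means a is strictly preferred to b

        def incomparable(self, a, b):
            return a != b and (a, b) not in self.p and (b, a) not in self.p

        def maximal(self, menu):
            """Undominated alternatives -- a natural choice set when preferences are incomplete."""
            return {a for a in menu if not any((b, a) in self.p for b in menu)}

    # Example: options are compared within, but not across, two incommensurable attributes.
    pref = PartialPreference({("high salary", "low salary"), ("short commute", "long commute")})
    print(pref.incomparable("high salary", "short commute"))            # True
    print(pref.maximal({"high salary", "low salary", "long commute"}))  # {'high salary', 'long commute'}
    ```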

    Set optimization - a rather short introduction

    Recent developments in set optimization are surveyed and extended, including various set relations as well as fundamental constructions of a convex analysis for set- and vector-valued functions, and duality for set optimization problems. Extensive sections with bibliographical comments summarize the state of the art. Applications to vector optimization and financial risk measures are discussed, along with algorithmic approaches to set optimization problems.
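
    As a toy illustration of one such set relation, the "lower set less" order with respect to the cone R^n_+, here restricted to finite sets of points (a simplification of ours, not the survey's general setting):

    ```python
    def lower_less(A, B):
        """Lower set less relation w.r.t. the cone R^n_+ on finite sets of tuples:
        A <=_l B  iff  B is contained in A + R^n_+, i.e. every point of B is
        componentwise >= some point of A."""
        return all(any(all(ai <= bi for ai, bi in zip(a, b)) for a in A) for b in B)

    A = {(0, 1), (1, 0)}
    B = {(2, 2), (1, 3)}
    print(lower_less(A, B), lower_less(B, A))  # True False
    ```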