26 research outputs found

    Proportional Fairness in Clustering: A Social Choice Perspective

    We study the proportional clustering problem of Chen et al. [ICML'19] and relate it to the area of multiwinner voting in computational social choice. We show that any clustering satisfying a weak proportionality notion of Brill and Peters [EC'23] simultaneously obtains the best known approximations not only to the proportional fairness notion of Chen et al. [ICML'19] but also to individual fairness [Jung et al., FORC'20] and the "core" [Li et al., ICML'21]. In fact, we show that any approximation to proportional fairness is also an approximation to individual fairness, and vice versa. Finally, we study stronger notions of proportional representation, in which deviations may involve not just a single candidate center but multiple ones, and show that stronger proportionality notions of Brill and Peters [EC'23] imply approximations to these stronger guarantees.
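    The blocking-coalition idea behind proportional fairness can be made concrete with a small checker. This is a minimal sketch, not code from the paper: it assumes 1-D points for simplicity, and the function name and toy instance are illustrative. It asks whether some unopened candidate center attracts at least ⌈n/k⌉ points that are each more than a factor ρ closer to it than to every open center.

    ```python
    import math

    def blocking_coalition_exists(points, centers, candidates, k, rho=1.0):
        """Sketch of a proportional-fairness check in the spirit of
        Chen et al. [ICML'19]: does some candidate center y attract a
        group of at least ceil(n/k) points, each more than a factor rho
        closer to y than to every open center? (1-D toy distances.)"""
        n = len(points)
        threshold = math.ceil(n / k)
        for y in candidates:
            supporters = [
                x for x in points
                if rho * abs(x - y) < min(abs(x - c) for c in centers)
            ]
            if len(supporters) >= threshold:
                return True
        return False

    # Toy instance: k = 2, but the cluster near 10 has no nearby center,
    # so its three points form a blocking coalition around candidate 10.1.
    points = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
    print(blocking_coalition_exists(points, centers=[0.1],
                                    candidates=[0.1, 10.1], k=2))  # True
    ```

    Opening a second center near the deviating cluster makes the check pass, which is the intuition the approximation results above quantify.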

    Relaxed Core Stability for Hedonic Games with Size-Dependent Utilities

    We study relationships between different relaxed notions of core stability in hedonic games. In particular, we study (i) q-size core stable outcomes in which no deviating coalition of size at most q exists and (ii) k-improvement core stable outcomes in which no coalition can improve by a factor of more than k. For a large class of hedonic games, including fractional and additively separable hedonic games, we derive upper bounds on the maximum factor by which a coalition of a certain size can improve in a q-size core stable outcome. We further provide asymptotically tight lower bounds for a large class of hedonic games. Finally, our bounds allow us to confirm two conjectures of Fanelli et al. [IJCAI'21] for symmetric fractional hedonic games (S-FHGs): (i) every q-size core stable outcome in an S-FHG is also q/(q-1)-improvement core stable and (ii) the price of anarchy of q-size stability in S-FHGs is precisely 2q/(q-1).
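    The two confirmed bounds are simple closed forms; the following toy snippet (illustrative function names, not code from the paper) tabulates them exactly for a few values of q.

    ```python
    from fractions import Fraction

    def improvement_bound(q):
        """Conjecture (i): a q-size core stable outcome in an S-FHG is
        q/(q-1)-improvement core stable."""
        return Fraction(q, q - 1)

    def price_of_anarchy(q):
        """Conjecture (ii): the price of anarchy of q-size stability in
        S-FHGs is exactly 2q/(q-1)."""
        return Fraction(2 * q, q - 1)

    for q in (2, 3, 5, 10):
        print(q, improvement_bound(q), price_of_anarchy(q))
    # Both bounds shrink as q grows: allowing larger deviating coalitions
    # leaves less room for improvement, and the PoA bound approaches 2.
    ```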

    Robust and verifiable proportionality axioms for multiwinner voting

    When selecting a subset of candidates (a so-called committee) based on the preferences of voters, proportional representation is often a major desideratum. When going beyond simplistic models such as party-list or district-based elections, it is surprisingly challenging to capture proportionality formally. As a consequence, the literature has produced numerous competing criteria of when a selected committee qualifies as proportional. Two of the most prominent notions are proportionality for solid coalitions (PSC) [Dummett, 1984] and extended justified representation (EJR) [Aziz et al., 2017]. Both definitions guarantee proportional representation to groups of voters with very similar preferences; such groups are referred to as solid coalitions by Dummett and as cohesive groups by Aziz et al. However, they lose their bite when groups are only almost solid or cohesive.
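    For approval ballots, EJR can be checked by brute force on toy instances. The sketch below is illustrative only (the check is coNP-hard in general, and this version is exponential in the number of voters): a committee fails EJR if some ℓ-cohesive group, i.e. at least ℓ·n/k voters sharing ℓ commonly approved candidates, contains no voter with ℓ approved committee members.

    ```python
    from itertools import combinations

    def satisfies_ejr(ballots, committee, k):
        """Brute-force EJR check for approval ballots (exponential in the
        number of voters; toy instances only). ballots: list of sets of
        approved candidates; committee: a set of k selected candidates."""
        n = len(ballots)
        for ell in range(1, k + 1):
            min_size = -(-ell * n // k)  # ceil(ell * n / k)
            if min_size > n:
                continue
            # Testing groups of the minimum qualifying size suffices: any
            # larger ell-cohesive group contains one of this size.
            for group in combinations(range(n), min_size):
                common = set.intersection(*(ballots[i] for i in group))
                if len(common) >= ell and all(
                    len(ballots[i] & committee) < ell for i in group
                ):
                    return False  # an ell-cohesive group is underrepresented
        return True

    ballots = [{"a"}, {"a"}, {"b"}, {"b"}]
    print(satisfies_ejr(ballots, {"a", "b"}, k=2))  # True
    print(satisfies_ejr(ballots, {"a", "c"}, k=2))  # False
    ```

    In the failing case, the two b-approvers form a 1-cohesive group (half the voters, one shared candidate) with no representation at all, which is exactly the situation EJR forbids.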

    Approval-Based Voting with Mixed Goods

    We consider a voting scenario in which the resource to be voted upon may consist of both indivisible and divisible goods. This setting generalizes both the well-studied model of multiwinner voting and the recently introduced model of cake sharing. Under approval votes, we propose two variants of the extended justified representation (EJR) notion from multiwinner voting, a stronger one called EJR for mixed goods (EJR-M) and a weaker one called EJR up to 1 (EJR-1). We extend three multiwinner voting rules to our setting -- GreedyEJR, the method of equal shares (MES), and proportional approval voting (PAV) -- and show that while all three generalizations satisfy EJR-1, only the first one provides EJR-M. In addition, we derive tight bounds on the proportionality degree implied by EJR-M and EJR-1, and investigate the proportionality degree of our proposed rules. Appears in the 37th AAAI Conference on Artificial Intelligence (AAAI), 2023.
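    In the classical multiwinner setting that this paper generalizes, PAV scores a committee by giving each voter 1 + 1/2 + ... + 1/r for r approved committee members. A minimal sketch (exhaustive optimization, fine only for toy instances; MES, by contrast, is polynomial-time):

    ```python
    from itertools import combinations

    def pav_score(ballots, committee):
        """PAV score: each voter contributes 1 + 1/2 + ... + 1/r, where r
        is the number of their approved candidates in the committee."""
        return sum(
            sum(1 / j for j in range(1, len(b & committee) + 1))
            for b in ballots
        )

    def pav_committee(ballots, candidates, k):
        """Exhaustive PAV optimization over all size-k committees
        (exponential; toy instances only)."""
        return max(
            (set(c) for c in combinations(candidates, k)),
            key=lambda committee: pav_score(ballots, committee),
        )

    ballots = [{"a", "b"}, {"a", "b"}, {"a", "b"}, {"c"}]
    print(sorted(pav_committee(ballots, ["a", "b", "c"], k=2)))  # ['a', 'b']
    ```

    The harmonic weights are what gives PAV its proportionality: a voter's marginal gain from an additional approved winner decreases the better represented they already are.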

    Proportionality in Approval-Based Participatory Budgeting

    The ability to measure the satisfaction of (groups of) voters is a crucial prerequisite for formulating proportionality axioms in approval-based participatory budgeting elections. Two common - but very different - ways to measure the satisfaction of a voter consider (i) the number of approved projects and (ii) the total cost of approved projects, respectively. In general, it is difficult to decide which measure of satisfaction best reflects the voters' true utilities. In this paper, we study proportionality axioms with respect to large classes of approval-based satisfaction functions. We establish logical implications among our axioms and related notions from the literature, and we ask whether outcomes can be achieved that are proportional with respect to more than one satisfaction function. We show that this is impossible for the two commonly used satisfaction functions when considering proportionality notions based on extended justified representation, but achievable for a notion based on proportional justified representation. For the latter result, we introduce a strengthening of priceability and show that it is satisfied by several polynomial-time computable rules, including the Method of Equal Shares and Phragmén's sequential rule.
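    The two satisfaction functions (i) and (ii) are easy to state in code; the hypothetical project names and costs below are illustrative only. The example shows how the two measures can rank the same pair of outcomes differently, which is why proportionality with respect to both is a nontrivial demand.

    ```python
    def sat_count(approved, funded):
        """Satisfaction (i): number of approved projects that get funded."""
        return len(approved & funded)

    def sat_cost(approved, funded, cost):
        """Satisfaction (ii): total cost of approved projects that get funded."""
        return sum(cost[p] for p in approved & funded)

    # Hypothetical instance where the two measures disagree.
    cost = {"park": 700, "bench": 100, "lamp": 200}
    voter = {"park", "bench"}
    print(sat_count(voter, {"bench", "lamp"}))       # 1
    print(sat_cost(voter, {"bench", "lamp"}, cost))  # 100
    print(sat_count(voter, {"park"}))                # 1 (same count...)
    print(sat_cost(voter, {"park"}, cost))           # 700 (...much higher cost)
    ```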

    A search for resonances decaying into a Higgs boson and a new particle X in the XH→qqbb final state with the ATLAS detector

    A search for heavy resonances decaying into a Higgs boson (H) and a new particle (X) is reported, utilizing 36.1 fb⁻¹ of proton-proton collision data at √s = 13 TeV collected during 2015 and 2016 with the ATLAS detector at the CERN Large Hadron Collider. The particle X is assumed to decay to a pair of light quarks, and the fully hadronic final state XH → qq̄'bb̄ is analysed. The search considers the regime of high XH resonance masses, where the X and H bosons are both highly Lorentz-boosted and are each reconstructed using a single jet with large radius parameter. A two-dimensional phase space of XH mass versus X mass is scanned for evidence of a signal, over a range of XH resonance mass values between 1 TeV and 4 TeV, and for X particles with masses from 50 GeV to 1000 GeV. All search results are consistent with the expectations for the background due to Standard Model processes, and 95% CL upper limits are set, as a function of XH and X masses, on the production cross-section of the XH → qq̄'bb̄ resonance.

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.

    The greenhouse gas emissions and mitigation options for materials used in UK construction

    The UK construction industry faces the daunting task of replacing and extending a significant proportion of UK infrastructure, meeting a growing housing shortage and retrofitting millions of homes whilst achieving greenhouse gas (GHG) emission reductions compatible with the UK's legally binding target of an 80% reduction by 2050. This paper presents a detailed time series of embodied GHG emissions from the construction sector for 1997–2011. These data are used to demonstrate that strategies which focus solely on improving the operational performance of buildings and the production efficiencies of domestic material producers will be insufficient to meet sector emission reduction targets. Reductions on the order of 80% will require a substantial decline in the use of materials with carbon-intensive supply chains. A variety of alternative materials, technologies and practices are available, and the common barriers to their use are presented based upon an extensive literature survey. Key gaps in qualitative research, data and modelling approaches are also identified. Subsequent discussion highlights the lack of client and regulatory drivers for uptake of alternatives and the ineffective allocation of responsibility for emissions reduction within the industry. Only by addressing and overcoming all these challenges in combination can the construction sector achieve drastic emissions reduction.

    schlably: A Python framework for deep reinforcement learning based scheduling experiments

    Research on deep reinforcement learning (DRL) based production scheduling (PS) has gained a lot of attention in recent years, primarily due to the high demand for optimizing scheduling problems in diverse industry settings. Numerous studies are carried out and published as stand-alone experiments that often vary only slightly with respect to problem setups and solution approaches. The programmatic core of these experiments is typically very similar. Despite this, no standardized, resilient framework for experimentation on PS problems with DRL algorithms has been established so far. In this paper, we introduce schlably, a Python-based framework that provides researchers with a comprehensive toolset to facilitate the development of PS solution strategies based on DRL. schlably eliminates the redundant overhead work that the creation of a sturdy and flexible backbone requires, and increases the comparability and reusability of conducted research work.