
    Getting People into Work: What (if Anything) Can Justify Mandatory Activation of Welfare Recipients?

    So-called activation policies, which aim to bring jobless people into work, have been a central component of welfare reforms across OECD countries over recent decades. Such policies combine restrictive and enabling programs, but their characteristic feature is that even the enabling programs are mandatory, and non-compliers are sanctioned. There are four main arguments that can be used to defend mandatory activation of benefit recipients, which we label efficiency, sustainability, paternalism, and justice. Each argument is analyzed in turn according to a strict scheme: first, we clarify which standards it invokes; thereafter, we evaluate the argument according to its own standards; finally, we introduce competing normative concerns that have to be taken into account. In the conclusion, we discuss possible constellations of arguments that make up the normative space for activation policies.

    The impact of anticipated discussion on cooperation in a social dilemma

    We study, in an experimental setting, the impact of anticipated face-to-face discussions among group members after they have made an anonymous contribution to a public good. We find that the impact of anticipated discussions depends on how the public good game is framed. When the game is framed in non-evaluative language, anticipated ex post discussions lead to a sharp reduction in contributions to the public good. This effect is reversed when evaluative language is used to underscore normative expectations. In contrast, there is no framing effect in the no-discussion baseline version of our game. We offer an explanation that centres on the idea that the announcement of ex post discussions reinforces both normative and predictive expectations.
    Keywords: Public Goods; Laboratory; Individual Behavior
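    For context, a standard linear public-goods payoff of the kind typically used in such experiments is shown below; this is an illustrative assumption, as the abstract does not give the paper's exact parameterisation.
    \[
      \pi_i \;=\; e - g_i + \frac{m}{n}\sum_{j=1}^{n} g_j , \qquad 1 < m < n,
    \]
    where e is the endowment, g_i the contribution of player i, n the group size, and m the multiplier; because m/n < 1, contributing is individually costly even though it raises group earnings, which creates the social dilemma.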

    Chapter Introduction

    Keywords: democracy; normative theory; political theory; public administration; public policy; policymaking

    Cooperation driven by mutations in multi-person Prisoner's Dilemma

    The n-person Prisoner's Dilemma is a widely used model for populations where individuals interact in groups. The evolutionary stability of such populations has been analysed in the literature for the case where mutations in the population may be considered isolated events. For this case, and assuming simple trigger strategies and many iterations per game, we analyse the rate of convergence to the evolutionarily stable populations. We find that for some values of the payoff parameters of the Prisoner's Dilemma this rate is so low that the assumption that mutations in the population are infrequent on that timescale is unreasonable. Furthermore, the problem is compounded as the group size is increased. To address this issue, we derive a deterministic approximation of the evolutionary dynamics with explicit, stochastic mutation processes, valid when the population size is large. We then analyse how the evolutionary dynamics depend on the following factors: the mutation rate, the group size, the values of the payoff parameters, and the structure of the initial population. In order to carry out simulations for groups of more than just a few individuals, we derive an efficient way of calculating the fitness values. We find that when the mutation rate per individual and generation is very low, the dynamics are characterised by populations that are evolutionarily stable. As the mutation rate is increased, other fixed points with a higher degree of cooperation become stable. For some values of the payoff parameters, the system is characterised by (apparently) stable limit cycles dominated by cooperative behaviour. The parameter regions corresponding to a high degree of cooperation grow in size with the mutation rate, and in number with the group size.
    Comment: 22 pages, 7 figures. Accepted for publication in the Journal of Theoretical Biology.
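    As an illustration of the kind of deterministic approximation described above, the sketch below iterates replicator dynamics with an explicit mutation term for a two-strategy iterated n-person Prisoner's Dilemma (trigger strategy versus always-defect). The payoff scheme and all parameter values are assumptions chosen for illustration, not the paper's model.

    import math

    # Minimal sketch (assumed parameters, not the paper's exact model): a
    # deterministic replicator-mutator approximation for an iterated n-person
    # Prisoner's Dilemma with two strategies, "trigger" (cooperate until any
    # defection occurs, then defect) and "always defect".

    N, m = 5, 50          # group size and rounds per game (assumed)
    r, c = 3.0, 1.0       # public-good multiplier (1 < r < N) and contribution cost (assumed)
    mu = 1e-3             # mutation rate per individual and generation (assumed)

    def binom_pmf(k, n, x):
        return math.comb(n, k) * x**k * (1.0 - x)**(n - k)

    def expected_payoffs(x):
        """Expected per-game payoffs to a trigger player and to a defector when
        the N-1 co-players are drawn from a population with trigger fraction x."""
        pi_t = pi_d = 0.0
        for k in range(N):                    # k = trigger players among the co-players
            p = binom_pmf(k, N - 1, x)
            if k == N - 1:                    # an all-trigger group cooperates in every round
                pi_t += p * m * c * (r - 1.0)
            else:                             # cooperation collapses after the first round
                pi_t += p * (r * c * (k + 1) / N - c)
            pi_d += p * (r * c * k / N)       # a defector free-rides on round-1 contributions
        return pi_t, pi_d

    def step(x, dt=0.01):
        """One Euler step of replicator dynamics plus symmetric mutation."""
        pi_t, pi_d = expected_payoffs(x)
        mean = x * pi_t + (1.0 - x) * pi_d
        dx = x * (pi_t - mean) + mu * (1.0 - 2.0 * x)
        return min(max(x + dt * dx, 0.0), 1.0)

    x = 0.9                                   # initial fraction of trigger players (assumed)
    for _ in range(50_000):
        x = step(x)
    print(f"long-run trigger fraction: {x:.3f}")

    With these (assumed) parameters the dynamics are bistable: starting from a mostly cooperative population, cooperation persists up to a small mutation load, while a mostly defecting population stays near full defection.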

    Proxy Measures for Simplified Environmental Assessment of Manufactured Nanomaterials

    Proxy measures have been proposed as a low-data option for simplified assessment of environmental threats, given the high complexity of the natural environment. We here review studies of environmental release, fate, toxicity, and risk to identify relevant proxy measures for manufactured nanomaterials (MNMs). In total, 18 potential proxy measures were identified and evaluated with regard to their link to environmental risk (an aspect of relevance) and their data availability (an aspect of practicality). They include socio-technical measures (e.g., MNM release), particle-specific measures (e.g., particle size), partitioning coefficients (e.g., the octanol–water coefficient), and other fate-related measures (e.g., half-life), as well as various ecotoxicological measures (e.g., the 50% effect concentration). For most of the identified proxy measures, the link to environmental risk is weak and data availability low. Two exceptions are global production volume and ecotoxicity, for which the links to environmental risk are strong and data availability is comparatively good. As a proof of concept, these two measures were employed to assess seven MNMs: titanium dioxide, cerium dioxide, zinc oxide, silver, silicon dioxide, carbon nanotubes, and graphene. The results show that none of these MNMs have both a high production volume and high ecotoxicity. Several refinements of the assessment are possible, such as higher resolution regarding the MNMs assessed (e.g., different allotropes) and different metrics (e.g., particle number and surface area). The proof of concept shows the feasibility of using proxy measures for environmental assessment of MNMs, in particular for novel MNMs in early technological development, when data are particularly scarce.
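    A minimal sketch of how the two retained proxies could be combined into a screening step is given below. The thresholds and example entries are hypothetical placeholders; none of the numbers are taken from the study.

    from dataclasses import dataclass

    @dataclass
    class MnmProxies:
        name: str
        production_volume_t_per_year: float   # global production volume (tonnes/year)
        ec50_mg_per_l: float                  # lower EC50 means higher ecotoxicity

    def screen(mnm: MnmProxies,
               volume_threshold: float = 10_000.0,    # hypothetical cut-off
               ec50_threshold: float = 1.0) -> bool:  # hypothetical cut-off
        """Flag a material only if it is high on BOTH proxies."""
        high_volume = mnm.production_volume_t_per_year >= volume_threshold
        high_ecotox = mnm.ec50_mg_per_l <= ec50_threshold
        return high_volume and high_ecotox

    # Usage with placeholder inputs (not data from the paper):
    candidates = [
        MnmProxies("material_A", production_volume_t_per_year=50_000, ec50_mg_per_l=12.0),
        MnmProxies("material_B", production_volume_t_per_year=300, ec50_mg_per_l=0.4),
    ]
    flagged = [mat.name for mat in candidates if screen(mat)]
    print(flagged or "no material is high on both proxies")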

    Environmental Assessment of Emerging Technologies: Recommendations for Prospective LCA

    The challenge of assessing emerging technologies with life cycle assessment (LCA) has been increasingly discussed in the LCA field. In this article, we propose a definition of prospective LCA: An LCA is prospective when the (emerging) technology studied is in an early phase of development (e.g., small-scale production), but the technology is modeled at a future, more-developed phase (e.g., large-scale production). Methodological choices in prospective LCA must be adapted to reflect this goal of assessing environmental impacts of emerging technologies, which deviates from the typical goals of conventional LCA studies. The aim of the article is to provide a number of recommendations for how to conduct such prospective assessments in a relevant manner. The recommendations are based on a detailed review of selected prospective LCA case studies, mainly from the areas of nanomaterials, biomaterials, and energy technologies. We find that it is important to include technology alternatives that are relevant for the future in prospective LCA studies. Predictive scenarios and scenario ranges are two general approaches to prospective inventory modeling of both foreground and background systems. Many different data sources are available for prospective modeling of the foreground system: scientific articles; patents; expert interviews; unpublished experimental data; and process modeling. However, we caution against temporal mismatches between foreground and background systems, and recommend that foreground and background system impacts be reported separately in order to increase the usefulness of the results in other prospective studies.
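    As a sketch of the reporting recommendation, the structure below keeps foreground and background impacts separate and carries a scenario range rather than a point estimate. All names and numbers are purely illustrative assumptions, not values from any case study.

    from dataclasses import dataclass

    @dataclass
    class ImpactRange:
        low: float
        high: float   # e.g. kg CO2-eq per functional unit across scenarios

    @dataclass
    class ProspectiveResult:
        technology: str
        foreground: ImpactRange   # the emerging process itself, modelled at future scale
        background: ImpactRange   # electricity, materials, transport, etc.

        def total(self) -> ImpactRange:
            # Totals can still be derived, but the split is preserved so the
            # foreground can later be reused with a different background scenario.
            return ImpactRange(self.foreground.low + self.background.low,
                               self.foreground.high + self.background.high)

    result = ProspectiveResult("hypothetical_material",
                               foreground=ImpactRange(1.2, 2.5),
                               background=ImpactRange(0.8, 3.0))
    print(result.total())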

    Measurement of the production of a W boson in association with a charm quark in pp collisions at √s = 7 TeV with the ATLAS detector

    The production of a W boson in association with a single charm quark is studied using 4.6 fb⁻¹ of pp collision data at √s = 7 TeV collected with the ATLAS detector at the Large Hadron Collider. In events in which a W boson decays to an electron or muon, the charm quark is tagged either by its semileptonic decay to a muon or by the presence of a charmed meson. The integrated and differential cross sections as a function of the pseudorapidity of the lepton from the W-boson decay are measured. Results are compared to the predictions of next-to-leading-order QCD calculations obtained from various parton distribution function parameterisations. The ratio of the strange-to-down sea-quark distributions is determined to be 0.96 +0.26/−0.30 at Q² = 1.9 GeV², which supports the hypothesis of an SU(3)-symmetric composition of the light-quark sea. Additionally, the cross-section ratio σ(W⁺ + c̄)/σ(W⁻ + c) is compared to the predictions obtained using parton distribution function parameterisations with different assumptions about the s–s̄ quark asymmetry.
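    Schematically, and using the conventional definitions (assumed here for illustration; the paper states its exact conventions), the quoted quantities are
    \[
      r_s \;\equiv\; \left.\frac{s(x) + \bar{s}(x)}{2\,\bar{d}(x)}\right|_{Q^2 = 1.9~\mathrm{GeV}^2} = 0.96^{+0.26}_{-0.30},
      \qquad
      R_c^{\pm} \;\equiv\; \frac{\sigma(W^{+}+\bar{c})}{\sigma(W^{-}+c)} ,
    \]
    where r_s = 1 would correspond to an SU(3)-symmetric light-quark sea.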

    Measurements of fiducial and differential cross sections for Higgs boson production in the diphoton decay channel at √s = 8 TeV with ATLAS

    Measurements of fiducial and differential cross sections are presented for Higgs boson production in proton-proton collisions at a centre-of-mass energy of √s = 8 TeV. The analysis is performed in the H → γγ decay channel using 20.3 fb⁻¹ of data recorded by the ATLAS experiment at the CERN Large Hadron Collider. The signal is extracted using a fit to the diphoton invariant mass spectrum assuming that the width of the resonance is much smaller than the experimental resolution. The signal yields are corrected for the effects of detector inefficiency and resolution. The pp → H → γγ fiducial cross section is measured to be 43.2 ± 9.4 (stat.) +3.2/−2.9 (syst.) ± 1.2 (lumi) fb for a Higgs boson of mass 125.4 GeV decaying to two isolated photons that have transverse momentum greater than 35% and 25% of the diphoton invariant mass and each with absolute pseudorapidity less than 2.37. Four additional fiducial cross sections and two cross-section limits are presented in phase space regions that test the theoretical modelling of different Higgs boson production mechanisms, or are sensitive to physics beyond the Standard Model. Differential cross sections are also presented, as a function of variables related to the diphoton kinematics and the jet activity produced in the Higgs boson events. The observed spectra are statistically limited but broadly in line with the theoretical expectations.
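    Schematically, the quoted fiducial cross section follows the standard extraction (a sketch of the general relation, not the paper's full formula):
    \[
      \sigma_{\mathrm{fid}} \;=\; \frac{N_{\mathrm{sig}}}{c \,\int L\,dt},
    \]
    where N_sig is the signal yield fitted from the diphoton invariant-mass spectrum, c is the correction factor accounting for detector inefficiency and resolution within the fiducial volume, and ∫L dt = 20.3 fb⁻¹ is the integrated luminosity.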

    Search for squarks and gluinos in events with isolated leptons, jets and missing transverse momentum at √s = 8 TeV with the ATLAS detector

    The results of a search for supersymmetry in final states containing at least one isolated lepton (electron or muon), jets and large missing transverse momentum with the ATLAS detector at the Large Hadron Collider are reported. The search is based on proton-proton collision data at a centre-of-mass energy of √s = 8 TeV collected in 2012, corresponding to an integrated luminosity of 20 fb⁻¹. No significant excess above the Standard Model expectation is observed. Limits are set on supersymmetric particle masses for various supersymmetric models. Depending on the model, the search excludes gluino masses up to 1.32 TeV and squark masses up to 840 GeV. Limits are also set on the parameters of a minimal universal extra dimension model, excluding a compactification radius of 1/Rc = 950 GeV for a cut-off scale times radius (ΛRc) of approximately 30.

    Measurement of χc1 and χc2 production with √s = 7 TeV pp collisions at ATLAS

    The prompt and non-prompt production cross-sections for the χc1 and χc2 charmonium states are measured in pp collisions at √s = 7 TeV with the ATLAS detector at the LHC using 4.5 fb⁻¹ of integrated luminosity. The χc states are reconstructed through the radiative decay χc → J/ψγ (with J/ψ → μ⁺μ⁻) where photons are reconstructed from γ → e⁺e⁻ conversions. The production rate of the χc2 state relative to the χc1 state is measured for prompt and non-prompt χc as a function of J/ψ transverse momentum. The prompt χc cross-sections are combined with existing measurements of prompt J/ψ production to derive the fraction of prompt J/ψ produced in feed-down from χc decays. The fractions of χc1 and χc2 produced in b-hadron decays are also measured.
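    Schematically, the feed-down fraction referred to above takes the standard form (assumed here for illustration):
    \[
      R_{\chi_c} \;=\; \frac{\sum_{J=1,2} \sigma(\chi_{cJ})\,\mathcal{B}(\chi_{cJ}\to J/\psi\,\gamma)}{\sigma(J/\psi)},
    \]
    with the prompt χcJ and J/ψ cross-sections evaluated in the same J/ψ kinematic range.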