
    Smoothed Complexity Theory

    Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case or average-case analysis have accompanying complexity classes, like P and AvgP, respectively. While worst-case or average-case analysis gives us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success in the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first hardness results (of bounded halting and tiling) and tractability results (binary optimization problems, graph coloring, satisfiability). Furthermore, we discuss extensions and shortcomings of our model and relate it to semi-random models.
    Comment: to be presented at MFCS 201
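    The idea behind smoothed analysis can be illustrated empirically (this sketch is not from the paper): take a worst-case input, perturb each entry with Gaussian noise of standard deviation sigma, and estimate the expected cost; the smoothed cost interpolates between worst case (sigma = 0) and average case (large sigma). A minimal Python sketch, using insertion sort as the example algorithm and illustrative function names:

```python
import random

def insertion_sort_comparisons(a):
    """Count the comparisons insertion sort makes while sorting a copy of a."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comps += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comps

def smoothed_cost(adversarial_input, sigma, trials=200, seed=0):
    """Estimate the expected comparison count when each entry of an
    adversarial input is perturbed by Gaussian noise of stddev sigma."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perturbed = [x + rng.gauss(0.0, sigma) for x in adversarial_input]
        total += insertion_sort_comparisons(perturbed)
    return total / trials

n = 50
worst = list(range(n, 0, -1))  # reversed order: worst case for insertion sort
print(insertion_sort_comparisons(worst))   # → 1225, i.e. n(n-1)/2
print(smoothed_cost(worst, sigma=0.0))     # → 1225.0, no noise recovers worst case
print(smoothed_cost(worst, sigma=50.0))    # large noise: cost drops toward average case
```

    Taking the maximum of this perturbed expectation over all inputs of a given size yields the smoothed measure; the sketch only evaluates one adversarial input.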

    Catches of Euxoa tritici in pheromone traps for Anarsia lineatella are due to the presence of (Z)-5-decenyl acetate as an impurity

    Traps baited with the synthetic pheromone of Anarsia lineatella Zeller (Lepidoptera: Gelechiidae) also frequently captured Euxoa tritici L. males (Lepidoptera: Noctuidae) in field tests in Hungary. As (E)-monounsaturated compounds are uncommon among sex attractants or pheromone components of Noctuidae, it was hypothesized that the Euxoa catches may have been due to impurities of the (Z) isomer in synthetic (E)-5-decenyl acetate, the major component in the pheromone lure of A. lineatella. Traps baited with synthetic (Z)-5-decenyl acetate captured large numbers of E. tritici, and the compound showed a clear dose–response effect. Reanalysis of the synthetic batch of (E)-5-decenyl acetate used in preparation of the A. lineatella lure showed the presence of 10% of the (Z) isomer. Traps baited with synthetic (Z)-5-decenyl acetate can be used in the future for detection and monitoring of E. tritici, a widely distributed pest of cereals and other field crops. The compound also attracted Euxoa seliginis Duponche

    A multidimensional account of democratic legitimacy: how to make robust decisions in a non-idealized deliberative context

    This paper analyses the possibility of granting legitimacy to democratic decision-making procedures in a context of deep pluralism. We defend a multidimensional account according to which a legitimate system needs to ensure, on the one hand, that citizens are included on an equal footing and acknowledged as reflexive political agents rather than mere beneficiaries of policies, and, on the other hand, that their decisions have an epistemic quality. While Estlund's account of imperfect epistemic proceduralism might seem to embody a dualistic conception of democratic legitimacy, we point out that it is not able to recognize citizens as reflexive political agents and is grounded in an idealized model of the circumstances of deliberation. To overcome these ambiguities, we develop an account of democratic legitimacy according to which disagreement is the proper expression of citizens' reflexive agency and the attribution of epistemic authority does not stem from greater expertise or a specific ability, but emerges through the public confrontation among disagreeing agents. Consequently, the epistemic value of deliberation should be derived from the reason-giving process rather than from reference to the alleged quality of its outcomes. In this way, we demonstrate the validity of the multidimensional perspective of legitimacy, yet abstain from introducing any outcome-oriented criterion. Finally, we argue that this account of legitimacy is well suited for modeling deliberative democracy as a decision-making procedure that respects the agency of every citizen and grants her the opportunity to influence public choices

    Improved limit on the directly measured antiproton lifetime

    Continuous monitoring of a cloud of antiprotons stored in a Penning trap for 405 days enables us to set an improved limit on the directly measured antiproton lifetime. From our measurements we extract a storage time of 3.15×10^8 equivalent antiproton-seconds, resulting in a lower lifetime limit of τ_p > 10.2 a at a confidence level of 68%. This result improves the limit on charge–parity–time violation in antiproton decays based on direct observation by a factor of 7

    A 16 Parts per Trillion Comparison of the Antiproton-to-Proton q/m Ratios

    The Standard Model (SM) of particle physics is both incredibly successful and glaringly incomplete. Among the questions left open is the striking imbalance of matter and antimatter in the observable universe, which inspires experiments to compare the fundamental properties of matter/antimatter conjugates with high precision. Our experiments deal with direct investigations of the fundamental properties of protons and antiprotons, performing spectroscopy in advanced cryogenic Penning-trap systems. For instance, we compared the proton/antiproton magnetic moments with 1.5 ppb fractional precision, which improved upon previous best measurements by a factor of >3000. Here we report on a new comparison of the proton/antiproton charge-to-mass ratios with a fractional uncertainty of 16 ppt. Our result is based on the combination of four independent long-term studies, recorded in a total time span of 1.5 years. We use different measurement methods and experimental setups incorporating different systematic effects. The final result, −(q/m)_p/(q/m)_p̄ = 1.000 000 000 003(16), is consistent with fundamental charge–parity–time (CPT) reversal invariance, and improves the precision of our previous best measurement by a factor of 4.3. The measurement tests the SM at an energy scale of 1.96×10^−27 GeV (C.L. 0.68), and improves 10 coefficients of the Standard Model Extension (SME). Our cyclotron-clock study also constrains hypothetical interactions mediating violations of the clock weak equivalence principle (WEP_cc) for antimatter to a level of |α_g − 1| < 1.8×10^−7, and enables the first differential test of the WEP_cc using antiprotons (Hughes 1991). From this interpretation we constrain the differential WEP_cc-violating coefficient to |α_{g,D} − 1| < 0.030

    BASE-STEP: A transportable antiproton reservoir for fundamental interaction studies

    Currently, the only worldwide source of low-energy antiprotons is the AD/ELENA facility located at CERN. To date, all precision measurements on single antiprotons have been conducted at this facility and provide stringent tests of the fundamental interactions and their symmetries. However, the magnetic field fluctuations from the facility operation limit the precision of upcoming measurements. To overcome this limitation, we have designed the transportable antiproton trap system BASE-STEP to relocate antiprotons to laboratories with a calm magnetic environment. We anticipate that the transportable antiproton trap will facilitate enhanced tests of CPT invariance with antiprotons, and provide new experimental possibilities of using transported antiprotons and other accelerator-produced exotic ions. We present here the technical design of the transportable trap system. This includes the transportable superconducting magnet, the cryogenic inlay consisting of the trap stack and the detection systems, and the differential pumping section to suppress the residual gas flow into the cryogenic trap chamber.
    Comment: To be submitted to Rev. Sci. Instrument

    The sacred and the profane: biotechnology, rationality, and public debate

    Davies G, 2006. The definitive, peer-reviewed and edited version of this article is published in Environment and Planning A, 38(3), pp. 423–443, DOI: 10.1068/a37387
    This paper explores the forms of argumentation employed by participants in a recent public engagement process in the United Kingdom around new technologies for organ transplantation, with specific reference to xenotransplantation and stem-cell research. Two forms of reasoning recur throughout participants’ deliberations which challenge specialist framing of this issue. First, an often scatological humour and sense of the profane are evident in the ways in which participants discuss the bodily transformations that such technologies demand. Second, a sense of the sacred, in which new biotechnologies are viewed as against nature or in which commercial companies are ‘playing god’, is a repetitive and well-recognised concern. Such forms of reasoning are frequently dismissed by policymakers as ‘uninformed gut reactions’. Yet they also form a significant part of the repertoire of scientists themselves as they proclaim the hope of new medical breakthroughs, or seek to reconstruct ideas of the body to facilitate new biotechnological transformations. Through questioning of assumptions in Habermas’s notion of discourse ethics, and exploring the importance of hybridity and corporeality as concepts in ethical thinking, the author suggests that, far from being ill-formed opinions, such reasonings perform an important function for thinking through the ontological significance of the corporealisation of these proposed new forms of human and animal bodies

    Participatory-deliberative processes and public policy agendas:Lessons for policy and practice

    Participatory and deliberative processes have proliferated over recent decades in public administration. These processes seek to increase the effectiveness and democratic quality of policy making by involving citizens in policy. However, they have mainly operated at local levels of governance, and democratic theorists and practitioners have developed an ambition to scale them up in order to democratize higher tiers of government. This paper draws policy lessons from research on a “multi-level” process that held a similar ambition. The Sustainable Communities Act sought to integrate the results of various locally organized citizen deliberations within the policy development processes of central UK government. In doing so, it aimed to democratize central government problem definition and agenda-setting processes. The paper distinguishes between achievements and failures explained by process design, and more fundamental obstacles to do with broader contextual factors. As such, it identifies lessons for the amelioration of design features, while recognizing constraints that are often beyond the agency of local practitioners. The findings offer practical insights for policy workers and democratic reformers seeking to institutionalize participatory and deliberative innovations

    Linguistic foundations of heritage language development from the perspective of romance languages in Germany

    This paper discusses the role of different factors determining the linguistic competence of heritage speakers (HSs), based on examples from speakers who speak a Romance language (French, Italian, Portuguese, or Spanish) as heritage language (HL) and German as the environmental language. Since the relative amount of contact with the HL and the environmental language may vary during the acquisition process, the role of language dominance (in terms of relative language proficiency) is of particular interest for HL development. In addition to dominance (and related to it), cross-linguistic influence (CLI) may have an influence on the outcome of HL acquisition. Finally, quality and quantity of input also determine HL acquisition and will be discussed in connection with heritage language education.