
    An improved lower bound for $(1,\leq 2)$-identifying codes in the king grid

    We call a subset $C$ of vertices of a graph $G$ a $(1,\leq \ell)$-identifying code if, for all subsets $X$ of vertices with size at most $\ell$, the sets $\{c\in C \mid \exists u \in X,\ d(u,c)\leq 1\}$ are distinct. The concept of identifying codes was introduced in 1998 by Karpovsky, Chakrabarty and Levitin. Identifying codes have been studied in various grids. In particular, it has been shown that there exists a $(1,\leq 2)$-identifying code in the king grid with density 3/7 and that there are no such identifying codes with density smaller than 5/12. Using a suitable frame and a discharging procedure, we improve the lower bound by showing that any $(1,\leq 2)$-identifying code of the king grid has density at least 47/111.
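
    As a hedged illustration of the definition only (not the authors' discharging argument), the Python sketch below brute-force checks the $(1,\leq 2)$-identifying property for a candidate code on a small toroidal patch of the king grid. The torus size, the function names, and any candidate code passed in are my own assumptions for illustration.

        from itertools import combinations, product

        def closed_ball(v, n):
            # v itself plus its 8 king-move neighbours, with wraparound
            # so that a periodic code can be tested on one period.
            x, y = v
            return {((x + dx) % n, (y + dy) % n)
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

        def identifying_set(X, code, n):
            # I(X) = {c in C : d(u, c) <= 1 for some u in X}.
            return frozenset(c for c in code
                             if any(c in closed_ball(u, n) for u in X))

        def is_1_le_2_identifying(code, n):
            vertices = list(product(range(n), repeat=2))
            seen = set()
            # Every X with 1 <= |X| <= 2 must get a nonempty, distinct I(X).
            for size in (1, 2):
                for X in combinations(vertices, size):
                    iset = identifying_set(X, code, n)
                    if not iset or iset in seen:
                        return False
                    seen.add(iset)
            return True

    A periodic construction, such as the known density-3/7 code, could in principle be tested by listing one period of it on a torus whose side length is a multiple of the period.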

    Meeting user needs for sea level rise information: a decision analysis perspective

    Despite widespread efforts to implement climate services, there is almost no literature that systematically analyses users' needs. This paper addresses this gap by applying a decision analysis perspective to identify what kind of mean sea-level rise (SLR) information is needed for local coastal adaptation decisions. We first characterize these decisions, then identify suitable decision analysis approaches and the sea-level information required, and finally discuss if and how these information needs can be met given the state of the art of sea-level science. We find that four types of information are needed: i) probabilistic predictions for short-term decisions when users are uncertainty tolerant; ii) high-end and low-end SLR scenarios chosen for different levels of uncertainty tolerance; iii) upper bounds of SLR for users with a low uncertainty tolerance; and iv) learning scenarios derived from estimating what knowledge will plausibly emerge about SLR over time. Probabilistic predictions can only be attained for the near term (i.e., 2030-2050), before SLR diverges significantly between low and high emission scenarios, and for locations where modes of climate variability are well understood and the vertical land movement contribution to local sea levels is small. Meaningful SLR upper bounds cannot be defined unambiguously from a physical perspective. Low-end to high-end scenarios for different levels of uncertainty tolerance, as well as learning scenarios, can be produced, but this involves both expert and user judgments. The decision analysis procedure elaborated here can be applied to other types of climate information required for mitigation and adaptation purposes.
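
    As a purely illustrative sketch of item ii), not a method taken from the paper, the following Python snippet maps a user's uncertainty tolerance to low-end and high-end quantiles of a hypothetical SLR projection ensemble. The ensemble values and the tolerance-to-quantile rule are invented for illustration.

        import statistics

        def scenario_bounds(ensemble, tolerance):
            # tolerance: fraction of ensemble probability the user accepts
            # outside the planning range, with 0 < tolerance < 1
            # (e.g. 0.10 -> the 5th and 95th percentiles).
            lo_q, hi_q = tolerance / 2, 1 - tolerance / 2
            qs = statistics.quantiles(ensemble, n=100, method="inclusive")
            # quantiles(n=100) returns the 1st..99th percentiles.
            return qs[round(lo_q * 100) - 1], qs[round(hi_q * 100) - 1]

        # Invented SLR projections for one location, metres by 2100:
        ensemble = [0.3, 0.35, 0.4, 0.45, 0.5, 0.6, 0.7, 0.9, 1.1]
        print(scenario_bounds(ensemble, 0.10))  # uncertainty-tolerant user
        print(scenario_bounds(ensemble, 0.02))  # uncertainty-averse user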

    Classical Logical versus Quantum Conceptual Thought: Examples in Economics, Decision Theory and Concept Theory

    Inspired by a quantum mechanical formalism for modeling concepts and their disjunctions and conjunctions, we put forward in this paper a specific hypothesis, namely that two superposed layers can be distinguished within human thought: (i) a layer given form by an underlying classical deterministic process, incorporating essentially logical thought and its indeterministic version modeled by classical probability theory; (ii) a layer given form under the influence of the totality of the surrounding conceptual landscape, where the different concepts figure as individual entities rather than (logical) combinations of others, with measurable quantities such as 'typicality', 'membership', 'representativeness', 'similarity', 'applicability', 'preference' or 'utility' carrying the influences. We call the process in this second layer 'quantum conceptual thought'; it is indeterministic in essence and contains holistic aspects, but is as well organized as logical thought, albeit very differently. A substantial part of the quantum conceptual thought process can be modeled by quantum mechanical probabilistic and mathematical structures. We consider examples from three domains of research where the effects of the presence of quantum conceptual thought and its deviations from classical logical thought have been noticed and studied, namely economics, decision theory, and concept theory, which provide experimental evidence for our hypothesis.
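
    As a minimal numeric sketch of the kind of test used in this literature (assumed here, not taken from the paper itself), classical probability constrains the membership weight of a disjunction 'A or B' to lie between max(mu_A, mu_B) and min(1, mu_A + mu_B); weights outside that range signal non-classicality, and quantum models absorb the deviation into an interference term. The weights below are invented.

        def classical_disjunction_range(mu_a, mu_b):
            # Kolmogorovian bounds: P(A) <= P(A or B) <= P(A) + P(B) <= 1.
            return max(mu_a, mu_b), min(1.0, mu_a + mu_b)

        def interference_term(mu_a, mu_b, mu_or):
            # Deviation from the averaged baseline used in quantum models:
            # mu_or = (mu_a + mu_b) / 2 + interference.
            return mu_or - (mu_a + mu_b) / 2

        # Invented membership weights where 'A or B' under-extends:
        mu_a, mu_b, mu_or = 0.5, 0.4, 0.3
        lo, hi = classical_disjunction_range(mu_a, mu_b)
        print("classically admissible:", lo <= mu_or <= hi)  # False
        print("interference term:", interference_term(mu_a, mu_b, mu_or))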

    Precision measurements of the top quark mass from the Tevatron in the pre-LHC era

    The top quark is the heaviest of the six quarks of the Standard Model. Precise knowledge of its mass is important for imposing constraints on a number of physics processes, including interactions of the as yet unobserved Higgs boson. The Higgs boson is the only missing particle of the Standard Model, central to the electroweak symmetry breaking mechanism and the generation of particle masses. In this Review, experimental measurements of the top quark mass accomplished at the Tevatron, a proton-antiproton collider located at the Fermi National Accelerator Laboratory, are described. Topologies of top quark events and methods used to separate signal events from background sources are discussed. Data analysis techniques used to extract information about the top mass value are reviewed. The combination of the most precise measurements performed with the two Tevatron particle detectors, CDF and DØ, yields a value of $m_t = 173.2 \pm 0.9$ GeV/$c^2$.
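
    As a simplified sketch of the combination step, not the actual Tevatron procedure (the real combination uses the BLUE method, which also accounts for correlated systematic uncertainties), the following Python snippet performs an inverse-variance weighted average of uncorrelated measurements. The input values are invented placeholders, not the real CDF/DØ results.

        def combine(measurements):
            # measurements: (value, uncertainty) pairs, assumed uncorrelated.
            weights = [1.0 / sigma ** 2 for _, sigma in measurements]
            total = sum(weights)
            value = sum(w * m for (m, _), w in zip(measurements, weights)) / total
            return value, total ** -0.5

        # Invented inputs in GeV/c^2:
        print(combine([(172.9, 1.2), (173.6, 1.4)]))  # ~ (173.2, 0.9)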