
    Rhetoricizing Habermas: The Restoration of Legitimacy as a Theme in the 1992 Televised Presidential Debates.

    During the past twenty years, scholars have posited the emergence of a legitimacy crisis in the American political system. Symptoms of the crisis included low voter turnout; a culture of withdrawal, cynicism, and alienation; and a widespread perception of institutional incompetence and indifference. At the very least, this widespread mood of apathy and decline has been seized upon by various candidates seeking political office, in particular the presidency, who routinely engage in discourse targeting legitimacy restoration. This discourse echoes the general theme of the Jeffersonian Myth, which predates Jefferson in its old Roman roots and casts the citizen as the primary source of political power and moral authority. Working from Habermas' writings on legitimacy crises and his ideal speech situation, this study developed three legitimacy topoi that were used as a critical method for understanding candidate discourse. These topoi were applied to the discourse of the 1992 televised presidential debates, selected for their economy of statements and voter impact, and because legitimacy had become a central theme of the 1992 elections. The study found that the third-party candidate indicted the legitimacy of the system and argued for restoration far more than the other two candidates, while the incumbent used legitimacy appeals the least. The Democratic challenger both affirmed and vilified the legitimacy of the government, showing that rhetorical strategy and logic do not always coincide.

    In-situ observation of nucleation and property evolution in films grown with an atmospheric pressure spatial atomic layer deposition system.

    Atmospheric pressure spatial atomic layer deposition (AP-SALD) is a promising open-air deposition technique for high-throughput manufacturing of nanoscale films, yet nucleation and property evolution in these films have not been studied in detail. In this work, in situ reflectance spectroscopy was implemented in an AP-SALD system to measure the properties of zinc oxide (ZnO) and aluminum oxide (Al2O3) films during their deposition. For the first time, this revealed a substrate nucleation period for this technique, with a nucleation time that was sensitive to the deposition parameters. The in situ characterization of thickness showed that varying the deposition parameters can achieve a wide range of growth rates (0.1–3 nm/cycle), and the evolution of optical properties throughout film growth was observed. For ZnO, the initial bandgap was larger when deposited at lower temperatures and subsequently decreased as the film thickness increased. Similarly, for Al2O3 the refractive index was lower for films deposited at a lower temperature and subsequently increased as the film thickness increased. Notably, whereas other implementations of reflectance spectroscopy require prior knowledge of the film's optical properties to fit the spectra to optical dispersion models, the approach developed here feeds a large range of initial guesses into a Levenberg-Marquardt fitting algorithm in parallel to accurately determine both the film thickness and the complex refractive index.
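    The paper's reflectance model and fitting code are not given in the abstract; the sketch below only illustrates the multi-start Levenberg-Marquardt idea using a hypothetical transparent single-layer film, with the film stack, index values, and wavelength grid all assumed for the example.

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed single-layer system: ambient / film (index n, thickness d in nm)
# / substrate, at normal incidence. Not the model from the paper.
n_air, n_sub = 1.0, 3.9                      # assumed ambient and substrate indices
wl = np.linspace(400.0, 800.0, 200)          # wavelength grid in nm

def reflectance(params, wl):
    n, d = params                            # film index and thickness (nm)
    r01 = (n_air - n) / (n_air + n)          # air/film Fresnel coefficient
    r12 = (n - n_sub) / (n + n_sub)          # film/substrate Fresnel coefficient
    phase = np.exp(-2j * 2 * np.pi * n * d / wl)
    r = (r01 + r12 * phase) / (1 + r01 * r12 * phase)
    return np.abs(r) ** 2

# Synthetic "measured" spectrum for a 100 nm film with n = 2.0.
measured = reflectance((2.0, 100.0), wl)

def residuals(params):
    return reflectance(params, wl) - measured

# Multi-start Levenberg-Marquardt: launch fits from a grid of initial
# guesses and keep the one with the smallest residual norm, sidestepping
# the local minima of the oscillatory thin-film spectrum.
guesses = [(n0, d0) for n0 in np.linspace(1.5, 2.5, 5)
                    for d0 in np.linspace(20.0, 300.0, 8)]
fits = [least_squares(residuals, g, method="lm") for g in guesses]
best = min(fits, key=lambda f: f.cost)
print(best.x)   # ≈ [2.0, 100.0]: recovered index and thickness
```

    Any single start may converge to a wrong interference order; ranking all starts by residual cost is what makes prior knowledge of the optical properties unnecessary, and the independent fits are trivially parallelizable.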

    Status Report of the DPHEP Study Group: Towards a Global Effort for Sustainable Data Preservation in High Energy Physics

    Data from high-energy physics (HEP) experiments are collected with significant financial and human effort and are mostly unique. An inter-experimental study group on HEP data preservation and long-term analysis was convened as a panel of the International Committee for Future Accelerators (ICFA). The group was formed by large collider-based experiments and investigated the technical and organisational aspects of HEP data preservation. An intermediate report was released in November 2009 addressing the general issues of data preservation in HEP. This paper includes and extends the intermediate report. It provides an analysis of the research case for data preservation and a detailed description of the various projects at experiment, laboratory and international levels. In addition, the paper provides a concrete proposal for an international organisation in charge of the data management and policies in high-energy physics.

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.
    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures.

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2-gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time-domain survey.
    The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures; version with high-resolution figures available from https://www.lsst.org/overvie
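    The quoted single-visit (r ~ 24.5) and coadded (r ~ 27.5) depths can be cross-checked with the standard relation for stacking N sky-background-limited exposures, m_coadd ≈ m_single + 1.25 log10(N). A minimal sketch; the implied r-band visit count is a back-of-envelope inference, not a number given in the abstract:

```python
import math

def coadd_depth(m_single, n_visits):
    # Stacking n background-limited visits improves flux sensitivity by
    # sqrt(n), i.e. by 2.5*log10(sqrt(n)) = 1.25*log10(n) magnitudes.
    return m_single + 1.25 * math.log10(n_visits)

# r-band visits implied by going from r ~ 24.5 (single visit) to
# r ~ 27.5 (coadd): 10**((27.5 - 24.5) / 1.25).
n_r = 10 ** ((27.5 - 24.5) / 1.25)
print(round(n_r))                          # 251
print(round(coadd_depth(24.5, 251), 2))    # 27.5
```

    The implied ~250 r-band visits is only a rough consistency check against the ~800 total visits summed over all six bands; the actual per-band allocation is set by the survey cadence, not by a uniform split.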

    Methylome Analysis and Epigenetic Changes Associated with Menarcheal Age

    CAD received funding from EU-Europe aid grant CRIS 2009/223–507. The EPIC cohort is supported by the Europe Against Cancer Program of the European Commission (SANCO). The individual centres also received funding from: Denmark (Danish Cancer Society); France (Ligue contre le Cancer, Institut Gustave Roussy, Mutuelle Générale de l'Education Nationale, and Institut National de la Santé et de la Recherche Médicale (INSERM)); Greece (Hellenic Ministry of Health, the Stavros Niarchos Foundation and the Hellenic Health Foundation); Germany (German Cancer Aid, German Cancer Research Center, and Federal Ministry of Education and Research (Grant 01-EA-9401)); Italy (Italian Association for Research on Cancer and the National Research Council); The Netherlands (Dutch Ministry of Public Health, Welfare and Sports (VWS), Netherlands Cancer Registry (NKR), LK Research Funds, Dutch Prevention Funds, and Dutch ZON (Zorg Onderzoek Nederland), World Cancer Research Fund (WCRF)); Spain (Health Research Fund (FIS) of the Spanish Ministry of Health (Exp 96/0032) and the participating regional governments and institutions); Sweden (Swedish Cancer Society, Swedish Scientific Council, and Regional Government of Skåne); and the United Kingdom (Cancer Research UK and Medical Research Council UK and Breast Cancer Campaign). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

    Engaging Research with Policy and Action: What are the Challenges of Responding to Zoonotic Disease in Africa?

    Zoonotic diseases will maintain a high level of public policy attention in the coming decades. From the spectre of a global pandemic to anxieties over agricultural change, urbanization, social inequality and threats to natural ecosystems, effectively preparing for and responding to endemic and emerging diseases will require technological, institutional and social innovation. Much current discussion emphasizes the need for a 'One Health' approach: bridging disciplines and sectors to tackle these complex dynamics. However, as attention has increased, so too has an appreciation of the practical challenges in linking multi-disciplinary, multi-sectoral research with policy, action and impact. In this commentary paper, we reflect on these issues with particular reference to the African sub-continent. We structure the themes of our analysis around the existing literature, expert opinion and 11 interviews with leading One Health scholars and practitioners, conducted at an international symposium in 2016. We highlight a variety of challenges in research and knowledge production, in the difficult terrain of implementation and outreach, and in the politicized nature of decision-making and priority setting. We then turn our attention to a number of strategies that might help reconfigure current pathways and accepted norms of practice. These include: (i) challenging scientific expertise; (ii) strengthening national multi-sectoral coordination; (iii) building on what works; and (iv) re-framing policy narratives. We argue that bridging the research-policy-action interface in Africa, and better connecting zoonoses, ecosystems and well-being in the twenty-first century, will ultimately require greater attention to the democratization of science and public policy. This article is part of the themed issue 'One Health for a changing world: zoonoses, ecosystems and human well-being'.

    A proposed systems approach to the evaluation of integrated palliative care

    <p>Abstract</p> <p>Background</p> <p>There is increasing global interest in regional palliative care networks (PCN) to integrate care, creating systems that are more cost-effective and responsive in multi-agency settings. Networks are particularly relevant where different professional skill sets are required to serve the broad spectrum of end-of-life needs. We propose a comprehensive framework for evaluating PCNs, focusing on the nature and extent of inter-professional collaboration, community readiness, and client-centred care.</p> <p>Methods</p> <p>In the absence of an overarching structure for examining PCNs, a framework was developed based on previous models of health system evaluation, explicit theory, and the research literature relevant to PCN functioning. This research evidence was used to substantiate the choice of model factors.</p> <p>Results</p> <p>The proposed framework takes a systems approach, considering three levels: system structure, process of care, and patient outcomes. Each factor represented makes an independent contribution to the description and assessment of the network.</p> <p>Conclusions</p> <p>Realizing palliative patients' needs for complex packages of treatment and social support in a seamless, cost-effective manner is a major driver of the impetus for network-integrated care. The proposed framework is a first step to guide evaluation and inform the development of strategies to further promote collaboration within the PCN and, ultimately, optimal palliative care that meets patients' needs and expectations.</p>

    A community-based geological reconstruction of Antarctic Ice Sheet deglaciation since the Last Glacial Maximum

    A robust understanding of Antarctic Ice Sheet deglacial history since the Last Glacial Maximum is important in order to constrain ice sheet and glacial-isostatic adjustment models, and to explore the forcing mechanisms responsible for ice sheet retreat. Such understanding can be derived from a broad range of geological and glaciological datasets, and recent decades have seen an upsurge in such data gathering around the continent and Sub-Antarctic islands. Here, we report a new synthesis of those datasets, based on an accompanying series of reviews of the geological data, organised by sector. We present a series of timeslice maps for 20 ka, 15 ka, 10 ka and 5 ka, including grounding-line position and ice sheet thickness changes, along with a clear assessment of levels of confidence. The reconstruction shows that the Antarctic Ice Sheet did not everywhere reach the continental shelf edge at its maximum, that initial retreat was asynchronous, and that the spatial pattern of deglaciation was highly variable, particularly on the inner shelf. The deglacial reconstruction is consistent with a moderate overall excess ice volume and with a relatively small Antarctic contribution to meltwater pulse 1a. We discuss key areas of uncertainty both around the continent and by time interval, and we highlight potential priorities. © 2014 The Authors