
    Improving Links Between Science and Coastal Management: Results of a Survey to Assess U.S. State Coastal Management Science and Technology Needs

    In Winter 2003/2004 the Coastal States Organization (CSO) sponsored a national survey of state coastal resource managers to better understand their science and technology needs. The web-based survey was sponsored by CSO with funding provided by the Cooperative Institute for Coastal and Estuarine Environmental Technology (CICEET) at the University of New Hampshire. This survey builds upon a previous survey conducted by CSO in 1999. CSO contracted with the Urban Harbors Institute (UHI) at UMass-Boston to prepare the survey questions and final report. The University of New Hampshire Survey Center was contracted to conduct the survey and analyze the results. Two hundred thirty (230) respondents from 33 states, territories, and Commonwealths completed the survey. Organizations participating in this survey included the Coastal States Organization (CSO), National Estuarine Research Reserve Association (NERRA), Association of National Estuary Programs (ANEP), Association of State Floodplain Managers (ASFPM), Association of State Wetland Managers (ASWM), Association of State and Interstate Water Pollution Control Administrators (ASIWPCA), and the Atlantic States Fishery Management Commission (ASFMC). While some analysis of the data and recommendations on how the report should be used are provided, this report is not intended to offer specific interpretations of the results. Rather, it is intended to raise awareness of the topics, research, information, and technology needs that are important to coastal resource managers, for the purpose of initiating further dialogue on what exactly these data mean and how they can best be applied to improve our future efforts.

    State Coastal Observations and Monitoring Needs: Results of a Survey to Assess Coastal Management Needs (DRAFT REPORT)

    The success of the U.S. Coastal Ocean Observing System (USCOOS) will be measured, in part, by how well the needs of the coastal management community are being addressed. The results of this survey indicate that the two most important management issues facing coastal programs are land use and habitat change. It is essential that the planning and implementation of the USCOOS take this into account and place a priority on addressing these high-priority management needs. This can only be accomplished through the direct, long-term involvement of the coastal management community with USCOOS efforts at the national and regional levels. By working together on this survey, SEACOOS and the coastal management community have demonstrated one way that coastal science and management can be focused on a common goal.

    Synergies and Prospects for Early Resolution of the Neutrino Mass Ordering

    The measurement of the neutrino Mass Ordering (MO) is a fundamental element for understanding the leptonic flavour sector of the Standard Model of Particle Physics. Its determination relies on the precise measurement of Δm²₃₁ and Δm²₃₂ using either neutrino vacuum oscillations, such as those studied by medium-baseline reactor experiments, or matter-effect-modified oscillations such as those manifesting in long-baseline neutrino beams (LBνB) or atmospheric neutrino experiments. Despite existing MO indications today, a fully resolved MO measurement (≥5σ) will most likely await the next generation of neutrino experiments: JUNO, whose stand-alone sensitivity is ~3σ, or LBνB experiments (DUNE and Hyper-Kamiokande). Upcoming atmospheric neutrino experiments are also expected to provide valuable information. In this work, we study the possible context for the earliest full MO resolution. A firm resolution is possible even before 2028, exploiting mainly vacuum oscillations, upon the combination of JUNO and the current generation of LBνB experiments (NOvA and T2K). This opportunity arises from a powerful synergy boosting the overall sensitivity, where the sub-percent precision on Δm²₃₂ from LBνB experiments is found to be the leading-order term for the earliest MO discovery. We also find that the comparison between matter- and vacuum-driven oscillation results enables unique discovery potential for physics beyond the Standard Model.
    Comment: Entitled in arXiv:2008.11280v1 as "Earliest Resolution to the Neutrino Mass Ordering?"
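    As a rough illustration of how independent significances combine (hypothetical numbers; the paper's actual synergy goes beyond simple quadrature, since the LBνB Δm²₃₂ constraint reshapes JUNO's fit rather than merely adding significance), the baseline expectation is that independent Gaussian measurements add in χ²:

```python
import math

def combined_sigma(*sigmas):
    """Combine independent Gaussian significances in quadrature:
    the total chi^2 is the sum of the individual sigma^2 terms."""
    return math.sqrt(sum(s * s for s in sigmas))

# Illustrative inputs only: JUNO stand-alone ~3 sigma, plus a
# hypothetical ~2 sigma contribution from current LBnuB experiments.
print(round(combined_sigma(3.0, 2.0), 2))  # ~3.61
```

    Any combined result that clearly exceeds this quadrature baseline is evidence of the kind of synergy the abstract describes.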

    Catching Element Formation In The Act

    Gamma-ray astronomy explores the most energetic photons in nature to address some of the most pressing puzzles in contemporary astrophysics. It encompasses a wide range of objects and phenomena: stars, supernovae, novae, neutron stars, stellar-mass black holes, nucleosynthesis, the interstellar medium, cosmic rays and relativistic-particle acceleration, and the evolution of galaxies. MeV gamma-rays provide a unique probe of nuclear processes in astronomy, directly measuring radioactive decay, nuclear de-excitation, and positron annihilation. The substantial information carried by gamma-ray photons allows us to see deeper into these objects, the bulk of the power is often emitted at gamma-ray energies, and radioactivity provides a natural physical clock that adds unique information. New science will be driven by time-domain population studies at gamma-ray energies. This science is enabled by next-generation gamma-ray instruments with one to two orders of magnitude better sensitivity, larger sky coverage, and faster cadence than all previous gamma-ray instruments. This transformative capability permits: (a) the accurate identification of the gamma-ray emitting objects and correlations with observations taken at other wavelengths and with other messengers; (b) construction of new gamma-ray maps of the Milky Way and other nearby galaxies where extended regions are distinguished from point sources; and (c) considerable serendipitous science of scarce events -- nearby neutron star mergers, for example. Advances in technology push the performance of new gamma-ray instruments to address a wide set of astrophysical questions.
    Comment: 14 pages including 3 figures

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time-domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
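    The quoted depths are roughly self-consistent: for background-limited imaging, stacking N visits improves the 5σ limit by 2.5·log10(√N) = 1.25·log10(N) magnitudes. A minimal sketch (assuming, purely for illustration, an even split of the ~800 visits across the six bands; the actual r-band share is larger):

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """5-sigma depth after stacking n_visits background-limited exposures:
    S/N grows as sqrt(n), i.e. 1.25 * log10(n) magnitudes deeper."""
    return single_visit_depth + 1.25 * math.log10(n_visits)

# Illustrative even split: ~800 visits / 6 bands -> ~133 visits in r,
# starting from the single-visit r depth of ~24.5 (AB).
print(round(coadded_depth(24.5, 800 // 6), 2))  # ~27.15
```

    This lands near the quoted coadded r ~ 27.5; since r receives more than an even share of the visits, the remaining gap largely closes.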

    STRAW-b (STRings for Absorption length in Water-b): the second pathfinder mission for the Pacific Ocean Neutrino Experiment

    Since 2018, the potential for a high-energy neutrino telescope, named the Pacific Ocean Neutrino Experiment (P-ONE), has been thoroughly examined by two pathfinder missions, STRAW and STRAW-b, short for Strings for Absorption Length in Water. The P-ONE project seeks to install a neutrino detector with a one cubic kilometer volume in the Cascadia Basin's deep marine surroundings, situated near the western shores of Vancouver Island, Canada. To assess the environmental conditions and the feasibility of constructing a neutrino detector of that scale, the pathfinder missions STRAW and STRAW-b have been deployed at a depth of 2.7 km within the designated site for P-ONE and were connected to the NEPTUNE observatory, operated by Ocean Networks Canada (ONC). While STRAW focused on analyzing the optical properties of water in the Cascadia Basin, STRAW-b employed cameras and spectrometers to investigate the characteristics of bioluminescence in the deep-sea environment. This report introduces the STRAW-b concept, covering its scientific objectives and the instrumentation used. Furthermore, it discusses the design considerations implemented to guarantee a secure and dependable deployment of STRAW-b. Additionally, it showcases the data collected by battery-powered loggers, which monitored the mechanical stress on the equipment throughout the deployment. The report also offers an overview of STRAW-b's operation, with a specific emphasis on the notable advancements achieved in the data acquisition (DAQ) system and its successful integration with the server infrastructure of ONC.
    Comment: 20 pages, 11 figures, 2 tables

    The application of adjuvant autologous intravesical macrophage cell therapy vs. BCG in non-muscle invasive bladder cancer: a multicenter, randomized trial

    Introduction: While adjuvant immunotherapy with Bacille Calmette-Guérin (BCG) is effective in non-muscle-invasive bladder cancer (BC), adverse events (AEs) are considerable. Monocyte-derived activated killer cells (MAK) are discussed as essential in the antitumoural immune response, but their application may imply risks. The present trial compared autologous intravesical macrophage cell therapy (BEXIDEM®) to BCG in patients after transurethral resection (TURB) of BC.
    Materials and methods: This open-label trial included 137 eligible patients with TaG1-3 or T1G1-2 plurifocal or unifocal tumours and ≥ 2 occurrences within 24 months, and was conducted from June 2004 to March 2007. Median follow-up for patients without recurrence was 12 months. Patients were randomized to BCG or to mononuclear cells collected by apheresis after ex vivo cell processing and activation (BEXIDEM). Treatment in either arm consisted of 6 weekly instillations and 2 cycles of 3 weekly instillations at months 3 and 6. The toxicity profile (primary endpoint) and prophylactic effects (secondary endpoint) were assessed.
    Results: Patient characteristics were evenly distributed. Of 73 patients treated with BCG and 64 with BEXIDEM, 85% vs. 45% experienced AEs and 26% vs. 14% serious AEs (SAEs), respectively (p < 0.001). Recurrence occurred significantly less frequently with BCG than with BEXIDEM (12% vs. 38%; p < 0.001).
    Discussion: This initial report of autologous intravesical macrophage cell therapy in BC demonstrates BEXIDEM treatment to be safe. Recurrence rates were significantly lower with BCG, however. As the efficacy of BEXIDEM remains uncertain, further data, e.g. marker lesion studies, are warranted.
    Trial registration: The trial has been registered in the ISRCTN registry (http://isrctn.org) under the registration number ISRCTN35881130.

    Explaining Institutional Change: Why Elected Politicians Implement Direct Democracy

    In existing models of direct democratic institutions, the median voter benefits, but representative politicians are harmed since their policy choices can be overridden. This is a puzzle, since representative politicians were instrumental in creating these institutions. I build a model of direct democracy that explains why a representative might benefit from tying his or her own hands in this way. The key features are (1) that voters are uncertain about their representative's preferences, and (2) that direct and representative elections are complementary ways for voters to control outcomes. The model shows that some politicians benefit from the introduction of direct democracy, since they are more likely to survive representative elections: direct democracy credibly prevents politicians from realising extreme outcomes. Historical evidence from the introduction of the initiative, referendum, and recall in America broadly supports the theory, which also explains two empirical results that have puzzled scholars: legislators are trusted less, but reelected more, in US states with direct democracy. I conclude by discussing the potential for incomplete information and signaling models to improve our understanding of institutional change more generally.