184 research outputs found

    An adaptive agent model for analysing co-evolution of management and policies in a complex rangeland system

    This paper describes an adaptive agent model of rangelands based on concepts of complex adaptive systems. The behavioural and biological processes of pastoralists, regulators, livestock, grass and shrubs are modelled, as well as the interactions between these components. The evolution of the rangeland system is studied under different policy and institutional regimes that affect the behaviour and learning of pastoralists, and hence the state of the ecological system. Adaptive agent models show that effective learning and effective ecosystem management do not necessarily coincide, and can suggest potentially useful alternatives in the design of policies and institutions. (C) 2000 Elsevier Science B.V.

    GRACKLE: a chemistry and cooling library for astrophysics

    We present the Grackle chemistry and cooling library for astrophysical simulations and models. Grackle provides a treatment of non-equilibrium primordial chemistry and cooling for H, D, and He species, including H2 formation on dust grains; tabulated primordial and metal cooling; multiple UV background models; and support for radiation transfer and arbitrary heat sources. The library has an easily implementable interface for simulation codes written in C, C++, and Fortran, as well as a Python interface with added convenience functions for semi-analytical models. As an open-source project, Grackle provides a community resource for accessing and disseminating astrochemical data and numerical methods. We present the full details of the core functionality, the simulation and Python interfaces, testing infrastructure, performance, and range of applicability. Grackle is a fully open-source project and new contributions are welcome. Comment: 20 pages, 8 figures, accepted for publication in MNRAS. For more info, visit grackle.readthedocs.i
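    To illustrate what "tabulated cooling" means in practice, here is a minimal Python sketch of log-log interpolation over a cooling table and the resulting cooling time. This is an illustration of the general technique only, not Grackle's actual API, and the table values below are invented placeholders:

```python
import math
from bisect import bisect_left

# Hypothetical cooling table: log10(T [K]) -> log10(Lambda [erg cm^3 / s]).
# These values are illustrative placeholders, not Grackle's real tables.
LOG_T = [4.0, 5.0, 6.0, 7.0, 8.0]
LOG_LAMBDA = [-22.0, -21.3, -21.8, -22.6, -22.9]

def cooling_rate(temperature):
    """Log-log linear interpolation of the tabulated cooling function."""
    log_t = math.log10(temperature)
    if log_t <= LOG_T[0]:
        return 10.0 ** LOG_LAMBDA[0]
    if log_t >= LOG_T[-1]:
        return 10.0 ** LOG_LAMBDA[-1]
    i = bisect_left(LOG_T, log_t)
    frac = (log_t - LOG_T[i - 1]) / (LOG_T[i] - LOG_T[i - 1])
    log_lam = LOG_LAMBDA[i - 1] + frac * (LOG_LAMBDA[i] - LOG_LAMBDA[i - 1])
    return 10.0 ** log_lam

def cooling_time(temperature, n_h):
    """Isochoric cooling time, t_cool = 3 n k T / (2 n_H^2 Lambda),
    taking the total number density n ~ n_H for simplicity."""
    k_boltzmann = 1.380649e-16  # erg / K
    return 1.5 * n_h * k_boltzmann * temperature / (n_h ** 2 * cooling_rate(temperature))
```

    Note the familiar scaling: at fixed temperature the cooling time falls as 1/n_H, which is why dense gas in simulations needs chemistry and cooling resolved on short timesteps.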

    The Suspected CANcer (SCAN) pathway: protocol for evaluating a new standard of care for patients with non-specific symptoms of cancer

    Introduction Cancer survival in England lags behind that of most European countries, due partly to lower rates of early-stage diagnosis. We report the protocol for the evaluation of a multidisciplinary diagnostic centre-based pathway for the investigation of ‘low-risk but not no-risk’ cancer symptoms, called the Suspected CANcer (SCAN) pathway. SCAN is a new standard of care being implemented in Oxfordshire, one of a number of pathways implemented during the second wave of the Accelerate, Coordinate, Evaluate (ACE) programme, an initiative which aims to improve England’s cancer survival rates by establishing effective routes to early diagnosis.

    Methods and analysis To evaluate SCAN, we are collating a prospective database of patients referred onto the pathway by their general practitioner (GP). Patients aged over 40 years, with non-specific symptoms such as weight loss or fatigue, who do not meet urgent cancer referral criteria or for whom symptom causation remains unclear after investigation via other existing pathways, can be referred to SCAN. SCAN provides rapid CT scanning, laboratory testing and clinic review within 2 weeks. We will follow all patients in the primary and secondary care record for at least 2 years. The data will be used to understand the diagnostic yield of the SCAN pathway in the short term (28 days) and the long term (2 years). Routinely collected primary and secondary care data from patients not referred to SCAN but with similar symptoms will also be used to evaluate SCAN. We will map the routes to diagnosis for patients referred to SCAN to assess cost-effectiveness. Acceptability will be evaluated using patient and GP surveys.

    Ethics and dissemination The Oxford Joint Research Office Study Classification Group has judged this to be a service evaluation and so outside of research governance. The results of this project will be disseminated by peer-reviewed publication and presentation at conferences.

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
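    The relation between the single-visit depth (~24.5 in r) and the coadded depth (r ~ 27.5) follows from the idealized assumption that point-source signal-to-noise in background-limited imaging grows as the square root of the number of stacked visits, so the limiting magnitude deepens by 1.25 log10(N). A minimal sketch of that arithmetic (the function name is ours):

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """5-sigma point-source depth after stacking n_visits exposures,
    assuming background-limited imaging where S/N grows as sqrt(N):
    delta_mag = 2.5 * log10(sqrt(N)) = 1.25 * log10(N)."""
    return single_visit_depth + 1.25 * math.log10(n_visits)
```

    Under this scaling, going from 24.5 to the quoted coadd of ~27.5 implies on the order of a few hundred effective visits in a single band, broadly consistent with ~800 visits summed over all six bands across 10 years.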

    Extended Sentinel Monitoring of Helicoverpa zea Resistance to Cry and Vip3Aa Toxins in Bt Sweet Corn: Assessing Changes in Phenotypic and Allele Frequencies of Resistance

    Transgenic corn and cotton that produce Cry and Vip3Aa toxins derived from Bacillus thuringiensis (Bt) are widely planted in the United States to control lepidopteran pests. The sustainability of these Bt crops is threatened because the corn earworm/bollworm, Helicoverpa zea (Boddie), is evolving resistance to these toxins. Using Bt sweet corn as a sentinel plant to monitor the evolution of resistance, collaborators established 146 trials in twenty-five states and five Canadian provinces during 2020–2022. The study evaluated overall changes in the phenotypic frequency of resistance (the ratio of larval densities in Bt ears relative to densities in non-Bt ears) in H. zea populations and the range of resistance allele frequencies for Cry1Ab and Vip3Aa. The results revealed widespread resistance to the Cry1Ab, Cry2Ab2, and Cry1A.105 Cry toxins, with higher numbers of larvae surviving in Bt ears than in non-Bt ears at many trial locations. Depending on assumptions about the inheritance of resistance, allele frequencies for Cry1Ab ranged from 0.465 (dominant resistance) to 0.995 (recessive resistance). Although Vip3Aa provided high control efficacy against H. zea, the results show a notable increase in ear damage and in the number of surviving older larvae, particularly at southern locations. Assuming recessive resistance, the estimated resistance allele frequencies for Vip3Aa ranged from 0.115 in the Gulf states to 0.032 at more northern locations. These findings indicate that better resistance management practices are urgently needed to sustain the efficacy of corn and cotton that produce Vip3Aa.
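    The dependence of the allele-frequency estimate on the assumed inheritance mode can be sketched with standard Hardy-Weinberg arithmetic. This is a simplified illustration of why dominant and recessive assumptions bracket the estimate so widely, not the study's actual estimation procedure:

```python
import math

def recessive_allele_freq(survival_ratio):
    """Fully recessive resistance: only rr survives on Bt,
    so survival_ratio ~ q^2 and q = sqrt(survival_ratio)."""
    return math.sqrt(survival_ratio)

def dominant_allele_freq(survival_ratio):
    """Fully dominant resistance: RR and Rr survive on Bt,
    so survival_ratio ~ 1 - (1 - q)^2 and q = 1 - sqrt(1 - survival_ratio)."""
    return 1.0 - math.sqrt(1.0 - survival_ratio)
```

    For the same observed survival ratio, the recessive model always implies a higher allele frequency than the dominant model, which is the pattern seen in the Cry1Ab estimates above (0.995 recessive versus 0.465 dominant).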

    THE AGORA HIGH-RESOLUTION GALAXY SIMULATIONS COMPARISON PROJECT. II. ISOLATED DISK TEST

    Using an isolated Milky Way-mass galaxy simulation, we compare results from nine state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and the common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as the Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions, including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly formed stellar clump mass functions show more significant variation (differing by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low-density region, and between more diffusive and less diffusive schemes in the high-density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Our experiment reassures us that, if adequately designed in accordance with our proposed common parameters, the results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.