
    SurF: an innovative framework in biosecurity and animal health surveillance evaluation

    Surveillance for biosecurity hazards is conducted by the New Zealand Competent Authority, the Ministry for Primary Industries (MPI), to support New Zealand's biosecurity system. Surveillance evaluation should be an integral part of the surveillance life cycle, as it provides a means to identify and correct problems and to sustain and enhance the existing strengths of a surveillance system. The surveillance evaluation framework (SurF) presented here was developed to provide a generic framework within which the MPI biosecurity surveillance portfolio, and all of its components, can be consistently assessed. SurF is an innovative, cross-sectoral effort that aims to provide a common umbrella for surveillance evaluation in the animal, plant, environment and aquatic sectors. It supports the conduct of the following four distinct components of an evaluation project: (i) motivation for the evaluation, (ii) scope of the evaluation, (iii) evaluation design and implementation and (iv) reporting and communication of evaluation outputs. Case studies, prepared by MPI subject matter experts, are included in the framework to guide users in their assessment. Three case studies were used in the development of SurF in order to assure practical utility and to confirm usability of SurF across all included sectors. It is anticipated that the structured approach and information provided by SurF will be of benefit not only to MPI but also to other New Zealand stakeholders. Although SurF was developed for internal use by MPI, it could be applied to any surveillance system in New Zealand or elsewhere.

    Exploring the Physical, Chemical and Biological Components of Soil: Improving Soil Health for Better Productive Capacity

    “Soil health” is a term that is used to describe soil quality. The U.S. Department of Agriculture’s Natural Resources Conservation Service has defined soil health as “The continued capacity of soil to function as a vital living ecosystem that sustains plants, animals and humans (NRCS 2018).” For a farmer, soil health is the productive capacity of the soil, or the capacity of the soil to produce a crop or pasture. Healthy soils produce more, and with better quality. Soil health is critical for water and nutrient cycling. Soil captures rainwater and stores it for use by plants. Soil health is important to improve both the amount of water and nutrients that a soil can hold, and the availability of water and nutrients for plants. The storage of water and nutrients and their subsequent transfer to plants are critical determinants of the productive capacity of the soil, and of soil health. Here, we explore the fundamental components of soil, and how each component contributes to soil health and soil productive capacity.

    Earliest Triassic microbialites in the South China Block and other areas; controls on their growth and distribution

    Earliest Triassic microbialites (ETMs) and inorganic carbonate crystal fans formed after the end-Permian mass extinction (ca. 251.4 Ma) within the basal Triassic Hindeodus parvus conodont zone. ETMs are distinguished from rarer, and more regional, subsequent Triassic microbialites. Large differences in ETMs between northern and southern areas of the South China block suggest geographic provinces, and ETMs are most abundant throughout the equatorial Tethys Ocean with further geographic variation. ETMs occur in shallow-marine shelves in a superanoxic stratified ocean and form the only widespread Phanerozoic microbialites with structures similar to those of the Cambro-Ordovician, and briefly after the latest Ordovician, Late Silurian and Late Devonian extinctions. ETMs disappeared long before the mid-Triassic biotic recovery, but it is not clear why, if they are interpreted as disaster taxa. In general, ETM occurrence suggests that microbially mediated calcification occurred where upwelled carbonate-rich anoxic waters mixed with warm aerated surface waters, forming regional dysoxia, so that extreme carbonate supersaturation and dysoxic conditions were both required for their growth. Long-term oceanic and atmospheric changes may have contributed to a trigger for ETM formation. In equatorial western Pangea, the earliest microbialites are late Early Triassic, but it is possible that ETMs could exist in western Pangea, if well-preserved earliest Triassic facies are discovered in future work.

    Gridded and direct Epoch of Reionisation bispectrum estimates using the Murchison Widefield Array

    We apply two methods to estimate the 21 cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly-spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 hours of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 hours, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21 cm bispectrum may be accessible in less time than the 21 cm power spectrum for some wave modes, with detections in hundreds of hours. Comment: 19 pages, 10 figures, accepted for publication in PAS
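
    The direct estimator described above reduces to simple arithmetic on closed baseline triads: for each triangle of baselines whose vectors sum to zero, the bispectrum is the triple product of the corresponding visibilities, averaged over time and over redundant copies of the triangle. The sketch below illustrates that idea only; the array layout, triad list and normalisation are placeholders, not the MWA pipeline.

```python
import numpy as np

def direct_bispectrum(vis, triads):
    """Direct bispectrum estimate from closed baseline triads.

    vis    : complex array of shape (n_times, n_baselines), calibrated
             visibilities for a single frequency channel (toy layout).
    triads : list of (i, j, k) baseline indices whose baseline vectors
             close, i.e. b_i + b_j + b_k = 0.
    Returns the time- and triad-averaged triple product <V_i V_j V_k>.
    """
    products = []
    for i, j, k in triads:
        # Triple product for one closed triangle, averaged over time.
        products.append(np.mean(vis[:, i] * vis[:, j] * vis[:, k]))
    # Average over redundant copies of the same triangle configuration.
    return np.mean(products)

# Toy usage: 2 time steps, 6 baselines, 2 redundant triads (random data).
rng = np.random.default_rng(0)
vis = rng.normal(size=(2, 6)) + 1j * rng.normal(size=(2, 6))
print(direct_bispectrum(vis, [(0, 1, 2), (3, 4, 5)]))
```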

    Ecological succession of a Jurassic shallow-water ichthyosaur fall.

    After the discovery of whale fall communities in modern oceans, it has been hypothesized that during the Mesozoic the carcasses of marine reptiles created similar habitats supporting long-lived and specialized animal communities. Here, we report a fully documented ichthyosaur fall community, from a Late Jurassic shelf setting, and reconstruct the ecological succession of its micro- and macrofauna. The early 'mobile-scavenger' and 'enrichment-opportunist' stages were not succeeded by a 'sulphophilic stage' characterized by chemosynthetic molluscs; instead the bones were colonized by microbial mats that attracted echinoids and other mat-grazing invertebrates. Abundant cemented suspension feeders indicate a well-developed 'reef stage' with prolonged exposure and colonization of the bones prior to final burial, unlike in modern whale falls where organisms such as the ubiquitous bone-eating worm Osedax rapidly destroy the skeleton. Shallow-water ichthyosaur falls thus fulfilled similar ecological roles to shallow whale falls, and did not support specialized chemosynthetic communities.

    Banks' risk assessment of Swedish SMEs

    Building on the literatures on asymmetric information and risk taking, this paper applies conjoint experiments to investigate lending officers' probabilities of supporting credit to established or existing SMEs. Using a sample of 114 Swedish lending officers, we test hypotheses concerning how information on the borrower's ability to repay the loan, alignment of risk preferences, and risk sharing affects their willingness to grant credit. Results suggest that features that reduce the risk to the bank and shift the risk to the borrower have the largest impact. The paper highlights the interaction between factors that influence the credit decision. Implications for SMEs, banks and research are discussed.

    WSClean : an implementation of a fast, generic wide-field imager for radio astronomy

    This article has been accepted for publication in Monthly Notices of the Royal Astronomical Society. © 2014 The Authors. Published by Oxford University Press on behalf of the Royal Astronomical Society. Astronomical widefield imaging of interferometric radio data is computationally expensive, especially for the large data volumes created by modern non-coplanar many-element arrays. We present a new widefield interferometric imager that uses the w-stacking algorithm and can make use of the w-snapshot algorithm. The performance dependencies of CASA's w-projection and our new imager are analysed, and analytical functions are derived that describe the required computing cost for both imagers. On data from the Murchison Widefield Array, we find our new method to be an order of magnitude faster than w-projection, as well as being capable of full-sky imaging at full resolution and with correct polarisation correction. We predict the computing costs for several other arrays and estimate that our imager is a factor of 2-12 faster, depending on the array configuration. We estimate the computing cost for imaging low-frequency Square Kilometre Array observations to be 60 petaFLOPS with current techniques. We find that combining w-stacking with the w-snapshot algorithm does not significantly improve computing requirements over pure w-stacking. The source code of our new imager is publicly released.
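
    As a rough illustration of the w-stacking approach the paper builds on (not the WSClean implementation itself), visibilities can be partitioned into discrete w-layers; each layer is gridded and Fourier transformed, and the resulting image is corrected by the w-dependent phase factor exp(2*pi*i*w*(sqrt(1 - l^2 - m^2) - 1)) before the layers are summed. Grid size, cell size, layer count and the nearest-neighbour gridding below are placeholder choices for a sketch only.

```python
import numpy as np

def w_stacking_dirty_image(u, v, w, vis, npix=256, cell=1e-3, n_layers=8):
    """Toy w-stacking imager: nearest-neighbour gridding, no weighting or kernel.

    u, v, w : baseline coordinates in wavelengths (1-D arrays).
    vis     : complex visibilities (same length as u, v, w).
    cell    : image pixel size in radians (placeholder value).
    """
    l = (np.arange(npix) - npix // 2) * cell          # direction cosines
    L, M = np.meshgrid(l, l, indexing="ij")
    n_minus_1 = np.sqrt(np.clip(1.0 - L**2 - M**2, 0.0, None)) - 1.0

    du = 1.0 / (npix * cell)                          # uv-grid spacing
    image = np.zeros((npix, npix))
    edges = np.linspace(w.min(), w.max() + 1e-9, n_layers + 1)
    for w_lo, w_hi in zip(edges[:-1], edges[1:]):
        sel = (w >= w_lo) & (w < w_hi)
        if not np.any(sel):
            continue
        # Grid this w-layer's visibilities (nearest uv cell, wrapped for safety).
        grid = np.zeros((npix, npix), dtype=complex)
        iu = (np.round(u[sel] / du).astype(int) + npix // 2) % npix
        iv = (np.round(v[sel] / du).astype(int) + npix // 2) % npix
        np.add.at(grid, (iu, iv), vis[sel])
        layer = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(grid)))
        # Apply the w-correction in the image plane at the layer centre.
        w_mid = 0.5 * (w_lo + w_hi)
        image += np.real(layer * np.exp(2j * np.pi * w_mid * n_minus_1))
    return image
```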

    High-energy sources at low radio frequency : the Murchison Widefield Array view of Fermi blazars

    This is the accepted version of the following article: Giroletti, M. et al., A&A, 588 (2016) A141, which has been published in final form at DOI: http://dx.doi.org/10.1051/0004-6361/201527817. This article may be used for non-commercial purposes in accordance with the EDP Sciences self-archiving policies. Low-frequency radio arrays are opening a new window for the study of the sky, both to study new phenomena and to better characterize known source classes. Being flat-spectrum sources, blazars are so far poorly studied at low radio frequencies. We characterize the spectral properties of the blazar population at low radio frequency, compare the radio and high-energy properties of the gamma-ray blazar population, and search for radio counterparts of unidentified gamma-ray sources. We cross-correlated the 6,100 deg^2 Murchison Widefield Array Commissioning Survey catalogue with the Roma blazar catalogue, the third catalogue of active galactic nuclei detected by Fermi-LAT, and the unidentified members of the entire third catalogue of gamma-ray sources detected by Fermi-LAT. When available, we also added high-frequency radio data from the Australia Telescope 20 GHz catalogue. We find low-frequency counterparts for 186 out of 517 (36%) blazars, 79 out of 174 (45%) gamma-ray blazars, and 8 out of 73 (11%) gamma-ray blazar candidates. The mean low-frequency (120–180 MHz) blazar spectral index is ⟨α_low⟩ = 0.57 ± 0.02: blazar spectra are flatter than the rest of the population of low-frequency sources, but are steeper than at ~GHz frequencies. Low-frequency radio flux density and gamma-ray energy flux display a mildly significant and broadly scattered correlation. Ten unidentified gamma-ray sources have a (probably fortuitous) positional match with low radio frequency sources. Low-frequency radio astronomy provides important information about sources with a flat radio spectrum and high energy. However, the relatively low sensitivity of the present surveys still misses a significant fraction of these objects. Upcoming deeper surveys, such as the GaLactic and Extragalactic All-Sky MWA (GLEAM) survey, will provide further insight into this population.
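
    For reference, the quoted spectral index uses the S ∝ ν^(−α) convention, so a two-point estimate between 120 and 180 MHz is just a ratio of logarithms. The flux densities in the sketch below are made-up illustrative values, not measurements from the survey.

```python
import math

def spectral_index(s1, nu1, s2, nu2):
    """Two-point spectral index alpha, using the S proportional to nu**(-alpha) convention."""
    return -math.log(s2 / s1) / math.log(nu2 / nu1)

# Hypothetical flux densities at 120 MHz and 180 MHz (not survey values):
# a source fading from 1.00 Jy to 0.79 Jy gives alpha of about 0.58, close
# to the quoted population mean of 0.57.
print(spectral_index(1.00, 120e6, 0.79, 180e6))
```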

    The absolute abundance calibration project: the Lycopodium marker-grain method put to the test

    Traditionally, dinoflagellate cyst concentrations are calculated by adding an exotic marker or “spike” (such as Lycopodium clavatum) to each sample, following the method of Stockmarr (1971). According to Maher (1981), the total error is controlled mainly by the error on the count of Lycopodium clavatum spores. In general, the more L. clavatum spores counted, the lower the error. A dinocyst / L. clavatum spore ratio of ~2 will give optimal results in terms of precision and time spent on a sample. It has also been shown that the aliquot method yields results comparable to the marker-grain method (de Vernal et al., 1987). A critical evaluation of the effect of different laboratory procedures on the marker-grain concentration in each sample has never been carried out. Although it has been reported that different processing methods (e.g. ultrasonication, oxidizing, etc.) are to a certain extent damaging to microfossils (e.g. Hodgkinson, 1991), it is not clear how this translates into concentration calculations. It is well known from the literature that concentration calculations of dinoflagellate cysts from different laboratories are hard to resolve into a consistent picture. The aim of this study is to remove these inconsistencies and to make recommendations for the use of a standardized methodology. Sediment surface samples from four different localities (North Sea, Celtic Sea, NW Africa and Benguela) were macerated in different laboratories, each using its own palynological maceration technique. A fixed amount of Lycopodium clavatum tablets was added to each sample. The effects of different preparation methodologies (sieving, ultrasonication, oxidation, …) are compared using both concentrations (calculated from the Lycopodium tablets) and relative abundances (more destructive methods will increase the proportion of resistant taxa). Additionally, this study focuses on some important taxonomic issues, since obvious interlaboratory differences in nomenclature are recorded.
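
    The marker-grain arithmetic being tested is straightforward: the cyst concentration is the cyst count scaled by the ratio of Lycopodium spores added to Lycopodium spores counted, per unit of sample, and, following Maher (1981), the counting error is dominated by the marker count. The sketch below uses made-up counts and a simple Poisson error term; the spores-per-tablet figure is a placeholder, not a batch value.

```python
import math

def cyst_concentration(n_cysts, n_lyco_counted, n_lyco_added, sample_mass_g):
    """Dinocyst concentration (cysts per gram) via the marker-grain method."""
    return n_cysts * n_lyco_added / (n_lyco_counted * sample_mass_g)

def counting_relative_error(n_cysts, n_lyco_counted):
    """Approximate (Poisson) relative error on the concentration estimate."""
    return math.sqrt(1.0 / n_cysts + 1.0 / n_lyco_counted)

# Made-up example: 200 cysts and 100 marker spores counted, one tablet of a
# nominal 18,000 spores added to 5 g of dry sediment.
c = cyst_concentration(200, 100, 18000, 5.0)
err = c * counting_relative_error(200, 100)
print(f"{c:.0f} +/- {err:.0f} cysts/g")
```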

    Adaptive hypermedia driven serious game design and cognitive style in school settings: an exploratory study

    The potential value of adaptive hypermedia and game-based learning to education and training has long been recognised; numerous studies have been undertaken in both areas investigating their potential to improve learner performance. In particular, research has indicated that tailoring content to match the prior knowledge of the user has the power to increase the effectiveness of learning systems. Recent studies have begun to indicate that Adaptive Hypermedia Learning Systems (AHLS) based on cognitive styles have the power to improve learner performance. Recent research exploring avenues for effectively incorporating serious games into AHLS has indicated that integrating serious games into a personalized learning environment offers the potential educational benefit of combining personalized delivery with increased learner motivation. The exploratory study presented in this paper developed an Adaptive Hypermedia Driven Serious Game (AHDSG) based around Pask’s Holist-Serialist dimension of cognitive style. A prototype AHDSG was designed and developed to teach students about Sutton Hoo and archaeological methods. Sixty-six secondary school students participated in the study. Overall, the findings show that there was an improvement in performance among all participants. Although the participants who used the version of the system adapted to their preferred cognitive style achieved a higher mean gain score, the difference was not significant.