
    The first successful application of Optically Stimulated Luminescence Dating to a colonial era (<0.25 ka) archaeological site in Australia

    While exploration of Australian post-colonial (≤0.25 ka) OSL dating is well established in a range of natural sedimentary contexts (e.g. fluvial, aeolian, coastal), to date there have been no successful examples of the technique applied to archaeological sediments of this era. Here we present the results of multi-phase, compliance-based archaeological excavations for a new bridge crossing the Hawkesbury-Nepean River (northwest Sydney). These works identified a Last Glacial Maximum (LGM) aeolian deposit through which a colonial-era drainage system had been excavated. Historical documents reveal that construction of the system occurred between 1814 and 1816 CE. An opportunistic range-finding Optically Stimulated Luminescence (OSL) sample was obtained from anthropogenic trench backfill – composed of reworked LGM deposits – immediately above the drainage system. Minimum and Finite Mixture age models of single-grain quartz OSL provided a date of 1826 CE (1806–1846 CE), in close agreement with the documented age of construction. These findings provide the first example of a colonial structure reliably dated using OSL, and demonstrate the feasibility of wider deployment of OSL dating at other archaeological sites of the recent era (≤0.25 ka). We propose that environments associated with large volumes of sand-rich backfill, in particular, are likely to heighten the chances of OSL dating success. We further propose that well-documented historical archaeological sites in Australia have the potential to provide a robust testing ground for evaluating the accuracy of OSL dating in a range of young archaeological sedimentary contexts, potentially to sub-decadal levels.
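
    For context, the arithmetic behind such a date is compact: an OSL age is the equivalent dose absorbed since burial divided by the environmental dose rate, converted to a calendar year from the sampling date. The sketch below uses illustrative values only, not the study's measurements.

        # Minimal sketch of the OSL age arithmetic: age = equivalent dose /
        # environmental dose rate. All values are illustrative placeholders,
        # not data from the study.
        equivalent_dose_gy = 0.38    # equivalent dose (De) in grays (assumed)
        dose_rate_gy_per_ka = 2.0    # environmental dose rate in Gy/ka (assumed)
        sampling_year_ce = 2020      # year of sample collection (assumed)

        age_ka = equivalent_dose_gy / dose_rate_gy_per_ka   # burial age in ka
        burial_year_ce = sampling_year_ce - age_ka * 1000.0

        print(f"age {age_ka:.2f} ka -> burial date ~{burial_year_ce:.0f} CE")
        # age 0.19 ka -> burial date ~1830 CE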

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.

    RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).

    RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing, coupled with a shorter turnaround time from sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets: eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave, and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, both between Africa and the rest of the world and within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world, with the detection of new lineages and variants, most recently the characterization of various Omicron subvariants.

    CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because the same platform can help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because it is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    An indoor path loss prediction model using wall correction factors for wireless local area network and 5G indoor networks

    A modified indoor path loss prediction model, the effective wall loss model, is presented. The modified model is compared to other indoor path loss prediction models using simulation data and real-time measurements, with different operating frequencies and antenna polarizations considered to verify the observations. In the simulations, the effective wall loss model shows the best performance of the models compared, outperforming the second-best model, the dual-slope model, by a factor of two. Similar observations were recorded in the experimental results. The linear attenuation and one-slope models behave similarly, and the parameters of both models depend on the operating frequency and antenna polarization.
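
    For context, the models compared here share a common form: the one-slope model predicts PL(d) = PL(d0) + 10·n·log10(d/d0), and wall-corrected variants add a loss term for each wall crossed by the direct path. The sketch below uses assumed parameter values, not the paper's fitted model.

        import math

        def one_slope(d_m, pl0_db=40.0, n=3.0):
            # One-slope model: PL(d) = PL(1 m) + 10*n*log10(d). The intercept
            # pl0_db and exponent n are placeholders; in practice they are
            # fitted per site, operating frequency, and antenna polarization.
            return pl0_db + 10.0 * n * math.log10(d_m)

        def wall_corrected(d_m, walls, pl0_db=40.0, n=2.0, wall_losses_db=None):
            # Wall-corrected model: a distance-dependent term plus one
            # correction factor per wall crossed by the direct path. The
            # per-wall losses are illustrative COST-231-style values, not
            # the parameters of the effective wall loss model itself.
            if wall_losses_db is None:
                wall_losses_db = {"light": 3.4, "heavy": 6.9}
            wall_term = sum(count * wall_losses_db[kind]
                            for kind, count in walls.items())
            return pl0_db + 10.0 * n * math.log10(d_m) + wall_term

        print(one_slope(10.0))                                 # 70.0 dB
        print(wall_corrected(10.0, {"light": 2, "heavy": 1}))  # 73.7 dB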

    Effects of dietary supplementation with Clostridium butyricum on the growth performance and humoral immune response in Miichthys miiuy

    The effects of dietary supplementation with Clostridium butyricum on growth performance and humoral immune response in Miichthys miiuy were evaluated. One hundred and fifty Miichthys miiuy weighing approximately 200–260 g were divided into five groups and reared in 15 tanks in a closed recirculating culture system. The animals were fed five diets: the basal diet only (control) or the basal diet supplemented with C. butyricum at doses of 10³ (CB1), 10⁵ (CB2), 10⁷ (CB3) or 10⁹ (CB4) CFU/g. Compared with the control, serum phenoloxidase activity was significantly increased by supplementation (P<0.05), and acid phosphatase activity was significantly increased (P<0.05) at the dose of 10⁹ CFU/g. Serum lysozyme activity peaked at the dose of 10⁷ CFU/g, and skin mucus lysozyme activity at 10⁹ CFU/g. Immunoglobulin M levels in the serum and skin mucus were increased at all doses except 10³ CFU/g (P<0.05). Growth at the dose of 10⁹ CFU/g was higher than that of the control (P<0.05). It is concluded that supplementation with C. butyricum can mediate humoral immune responses and improve growth performance in Miichthys miiuy.

    Trematodes and snails: an intimate association


    Targeting Membrane Receptors of Ovarian Cancer Cells for Therapy


    Critical care usage after major gastrointestinal and liver surgery: a prospective, multicentre observational study

    Background Patient selection for critical care admission must balance patient safety with optimal resource allocation. This study aimed to determine the relationship between critical care admission and postoperative mortality after abdominal surgery.

    Methods This prespecified secondary analysis of a multicentre, prospective, observational study included consecutive patients enrolled in the DISCOVER study from the UK and Republic of Ireland undergoing major gastrointestinal and liver surgery between October and December 2014. The primary outcome was 30-day mortality. Multivariate logistic regression was used to explore associations between critical care admission (planned and unplanned) and mortality, and inter-centre variation in critical care admission after emergency laparotomy.

    Results Of the 4529 patients included, 37.8% (n=1713) underwent planned critical care admission from theatre. Some 3.1% (n=86/2816) of those admitted to ward-level care subsequently underwent unplanned critical care admission. Overall 30-day mortality was 2.9% (n=133/4519), and the risk-adjusted association between 30-day mortality and critical care admission was stronger for unplanned [odds ratio (OR): 8.65, 95% confidence interval (CI): 3.51–19.97] than for planned admissions (OR: 2.32, 95% CI: 1.43–3.85). Some 26.7% of patients (n=1210/4529) underwent emergency laparotomies. After adjustment, 49.3% (95% CI: 46.8–51.9%, P<0.001) were predicted to have planned critical care admissions, with 7% (n=10/145) of centres outside the 95% CI.

    Conclusions After risk adjustment, no 30-day survival benefit was identified for either planned or unplanned postoperative admission to critical care within this cohort. This likely represents appropriate admission of the highest-risk patients. Planned admission of selected, intermediate-risk patients may present a strategy to mitigate the risk of unplanned admission. Substantial inter-centre variation exists in planned critical care admissions after emergency laparotomies.
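
    As a minimal sketch of how such risk-adjusted odds ratios are typically derived (not the study's analysis code; the input file, variable names, and covariates are hypothetical), a logistic model is fitted with admission type and case-mix covariates, and the coefficients are exponentiated together with their 95% confidence intervals.

        # Hypothetical sketch of deriving risk-adjusted odds ratios; the file
        # and column names are assumed, and the covariates are placeholders
        # for the study's actual risk-adjustment variables.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("discover_cohort.csv")  # hypothetical cohort extract

        fit = smf.logit(
            "mortality_30d ~ planned_ccu + unplanned_ccu + age + asa_grade",
            data=df,
        ).fit()

        ci = fit.conf_int()  # columns 0 and 1 hold each term's CI bounds
        odds_ratios = pd.DataFrame({
            "OR": np.exp(fit.params),
            "CI 2.5%": np.exp(ci[0]),
            "CI 97.5%": np.exp(ci[1]),
        })
        print(odds_ratios.loc[["planned_ccu", "unplanned_ccu"]])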