40 research outputs found

    Effects of Land Crabs on Leaf Litter Distributions and Accumulations in a Mainland Tropical Rain Forest

    Full text link
    The effect of the fossorial land crab Gecarcinus quadratus (Gecarcinidae) on patterns of accumulation and distribution of leaf litter was studied for two years in the coastal primary forests of Costa Rica's Corcovado National Park. Within this mainland forest, G. quadratus achieves densities of up to 6 crabs/m² in populations extending along the Park's Pacific coastline and inland for ca 600 m. Crabs selectively forage for fallen leaf litter and relocate what they collect to burrow chambers that extend from 15 to 150 cm deep (N = 44), averaging (±SE) 48.9 ± 3.0 cm. Preference trials suggested that leaf choice by crabs may be species-specific. Excavated crab burrows revealed maximum leaf collections of 11.75 g dry mass, 2.5 times more leaf litter than collected by square-meter leaf fall traps over several seven-day sampling periods. Additionally, experimental crab exclosures (25 m²) were established using a repeated-measures randomized block design to test for changes in leaf litter as a function of reduced crab density. Exclosures accumulated significantly more (5.6 ± 3.9 times) leaf litter than did control treatments during the wet, but not the dry, seasons over this two-year study. Such extensive litter relocation by land crabs may affect profiles of soil organic carbon, rooting, and seedling distributions.

    Genomic and Geographic Context for the Evolution of High-Risk Carbapenem-Resistant Enterobacter cloacae Complex Clones ST171 and ST78

    Get PDF
    Recent reports have established the escalating threat of carbapenem-resistant Enterobacter cloacae complex (CREC). Here, we demonstrate that CREC has evolved as a highly antibiotic-resistant rather than highly virulent nosocomial pathogen. Applying genomics and Bayesian phylogenetic analyses to a 7-year collection of CREC isolates from a northern Manhattan hospital system and to a large set of publicly available, geographically diverse genomes, we demonstrate clonal spread of a single clone, ST171. We estimate that two major clades of epidemic ST171 diverged prior to 1962, subsequently spreading in parallel from the Northeastern to the Mid-Atlantic and Midwestern United States and demonstrating links to international sites. Acquisition of carbapenem and fluoroquinolone resistance determinants by both clades preceded widespread use of these drugs in the mid-1980s, suggesting that antibiotic pressure contributed substantially to its spread. Despite a unique mobile repertoire, ST171 isolates showed decreased virulence in vitro. While a second clone, ST78, substantially contributed to the emergence of CREC, it encompasses diverse carbapenemase-harboring plasmids, including a potentially hypertransmissible IncN plasmid, also present in other sequence types. Rather than heightened virulence, CREC demonstrates lineage-specific, multifactorial adaptations to nosocomial environments coupled with a unique potential to acquire and disseminate carbapenem resistance genes. These findings indicate a need for robust surveillance efforts that are attentive to the potential for local and international spread of high-risk CREC clones. IMPORTANCE Carbapenem-resistant Enterobacter cloacae complex (CREC) has emerged as a formidable nosocomial pathogen. While sporadic acquisition of plasmid-encoded carbapenemases has been implicated as a major driver of CREC, ST171 and ST78 clones demonstrate epidemic potential. 
However, a lack of reliable genomic references and rigorous statistical analyses has left many gaps in knowledge regarding the phylogenetic context and evolutionary pathways of successful CREC. Our reconstruction of recent ST171 and ST78 evolution represents a significant addition to current understanding of CREC and the directionality of its spread from the Eastern United States to the northern Midwestern United States, with links to international collections. Our results indicate that the remarkable ability of E. cloacae to acquire and disseminate cross-class antibiotic resistance rather than virulence determinants, coupled with its ability to adapt under conditions of antibiotic pressure, likely led to the wide dissemination of CREC.

    Ethical leadership in an age of evaluation: implications for whole-school well-being

    Get PDF
    The evaluation and inspection of many public services, including education, have become increasingly common in most countries in the developed world (McNamara & O’Hara, 2004; MacBeath & McGlynn, 2002). There are various reasons why this may be the case. It can be argued that it is, on the one hand, part of the movement towards low-trust policies derived from the ideology of neo-liberalism, which seeks to apply the values of the market to the public sector. On the other hand, it can be argued that increased evaluation is a necessary and defensible component of democratic accountability, responsibility and transparency (O’Neill, 2002). The research reported here sets out to explore the idea of a personal vision or core of ethics as being central to educational leadership, through in-depth interviews with a number of school leaders. The chapter begins by briefly placing educational leadership in the modern context, characterised by the paradox of apparently greater decentralisation of responsibility to schools being in fact coupled with a further centralisation of actual power and greatly increased surveillance of performance (Neave, 1998). Relevant developments internationally, and then specifically in the context of Ireland, are described. It is suggested that in Ireland the modern educational context may indeed be creating difficult ethical and moral dilemmas for leaders to face. To see if this is so in practice, five in-depth interviews with school principals are reported. The evidence arising from these interviews indicates that school leaders do feel guided by a strong moral or ethical compass.

    Inhibition of resistance-refractory P. falciparum kinase PKG delivers prophylactic, blood stage, and transmission-blocking antiplasmodial activity

    Get PDF
    The search for antimalarial chemotypes with modes of action unrelated to existing drugs has intensified with the recent failure of first-line therapies across Southeast Asia. Here, we show that the trisubstituted imidazole MMV030084 potently inhibits hepatocyte invasion by Plasmodium sporozoites, merozoite egress from asexual blood stage schizonts, and male gamete exflagellation. Metabolomic, phosphoproteomic, and chemoproteomic studies, validated with conditional knockdown parasites, molecular docking, and recombinant kinase assays, identified cGMP-dependent protein kinase (PKG) as the primary target of MMV030084. PKG is known to play essential roles in Plasmodium invasion of and egress from host cells, matching MMV030084's activity profile. Resistance selections and gene editing identified tyrosine kinase-like protein 3 as a low-level resistance mediator for PKG inhibitors, while PKG itself never mutated under pressure. These studies highlight PKG as a resistance-refractory antimalarial target throughout the Plasmodium life cycle and promote MMV030084 as a promising Plasmodium PKG-targeting chemotype.

    Safety, immunogenicity, and reactogenicity of BNT162b2 and mRNA-1273 COVID-19 vaccines given as fourth-dose boosters following two doses of ChAdOx1 nCoV-19 or BNT162b2 and a third dose of BNT162b2 (COV-BOOST): a multicentre, blinded, phase 2, randomised trial

    Get PDF

    Background: Some high-income countries have deployed fourth doses of COVID-19 vaccines, but the clinical need, effectiveness, timing, and dose of a fourth dose remain uncertain. We aimed to investigate the safety, reactogenicity, and immunogenicity of fourth-dose boosters against COVID-19. Methods: The COV-BOOST trial is a multicentre, blinded, phase 2, randomised controlled trial of seven COVID-19 vaccines given as third-dose boosters at 18 sites in the UK. This sub-study enrolled participants who had received BNT162b2 (Pfizer-BioNTech) as their third dose in COV-BOOST and randomly assigned them (1:1) to receive a fourth dose of either BNT162b2 (30 µg in 0·30 mL; full dose) or mRNA-1273 (Moderna; 50 µg in 0·25 mL; half dose) via intramuscular injection into the upper arm. The computer-generated randomisation list was created by the study statisticians with random block sizes of two or four. Participants and all study staff not delivering the vaccines were masked to treatment allocation. The coprimary outcomes were safety and reactogenicity, and immunogenicity (anti-spike protein IgG titres by ELISA and cellular immune response by ELISpot). We compared immunogenicity at 28 days after the third dose versus 14 days after the fourth dose, and at day 0 versus day 14 relative to the fourth dose. Safety and reactogenicity were assessed in the per-protocol population, which comprised all participants who received a fourth-dose booster regardless of their SARS-CoV-2 serostatus. Immunogenicity was primarily analysed in a modified intention-to-treat population comprising seronegative participants who had received a fourth-dose booster and had available endpoint data. This trial is registered with ISRCTN, 73765130, and is ongoing. Findings: Between Jan 11 and Jan 25, 2022, 166 participants were screened, randomly assigned, and received either full-dose BNT162b2 (n=83) or half-dose mRNA-1273 (n=83) as a fourth dose.
The median age of these participants was 70·1 years (IQR 51·6–77·5); 86 (52%) of 166 participants were female and 80 (48%) were male. The median interval between the third and fourth doses was 208·5 days (IQR 203·3–214·8). Pain was the most common local solicited adverse event and fatigue was the most common systemic solicited adverse event after BNT162b2 or mRNA-1273 booster doses. None of the three serious adverse events reported after a fourth dose with BNT162b2 were related to the study vaccine. In the BNT162b2 group, geometric mean anti-spike protein IgG concentration at day 28 after the third dose was 23 325 ELISA laboratory units (ELU)/mL (95% CI 20 030–27 162), which increased to 37 460 ELU/mL (31 996–43 857) at day 14 after the fourth dose, representing a significant fold change (geometric mean 1·59, 95% CI 1·41–1·78). There was a significant increase in geometric mean anti-spike protein IgG concentration from 28 days after the third dose (25 317 ELU/mL, 95% CI 20 996–30 528) to 14 days after a fourth dose of mRNA-1273 (54 936 ELU/mL, 46 826–64 452), with a geometric mean fold change of 2·19 (1·90–2·52). The fold changes in anti-spike protein IgG titres from before (day 0) to after (day 14) the fourth dose were 12·19 (95% CI 10·37–14·32) and 15·90 (12·92–19·58) in the BNT162b2 and mRNA-1273 groups, respectively. T-cell responses were also boosted after the fourth dose (eg, the fold changes for the wild-type variant from before to after the fourth dose were 7·32 [95% CI 3·24–16·54] in the BNT162b2 group and 6·22 [3·90–9·92] in the mRNA-1273 group). Interpretation: Fourth-dose COVID-19 mRNA booster vaccines are well tolerated and boost cellular and humoral immunity. Peak responses after the fourth dose were similar to, and possibly better than, peak responses after the third dose.
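The headline immunogenicity results above are geometric mean fold changes in antibody titres. As a minimal sketch of how such a summary statistic is computed, using hypothetical per-participant titre values rather than trial data:

```python
import math

def geometric_mean(values):
    """Geometric mean: the n-th root of the product of n positive values,
    computed on the log scale for numerical stability."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical anti-spike IgG titres (ELU/mL) for three participants before
# and after a booster dose; illustrative values only, not trial data.
pre = [20000.0, 25000.0, 22000.0]
post = [32000.0, 41000.0, 35000.0]

# Per-participant fold changes, summarised as a geometric mean fold change,
# mirroring how the trial reports e.g. a 1.59-fold rise after the fourth dose.
fold_changes = [b / a for a, b in zip(pre, post)]
gmfc = geometric_mean(fold_changes)
```

Averaging fold changes on the log scale (i.e., taking the geometric mean) treats a doubling and a halving symmetrically, which is why skewed titre data are conventionally summarised this way.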

    A proposed methodology for uncertainty extraction and verification in priority setting partnerships with the James Lind Alliance: an example from the Common Conditions Affecting the Hand and Wrist Priority Setting Partnership

    No full text
    Background: To report our recommended methodology for extracting and then confirming research uncertainties – areas where research has failed to answer a research question – derived from previously published literature during a broad-scope Priority Setting Partnership (PSP) with the James Lind Alliance (JLA). Methods: This process was completed in the UK as part of the PSP for “Common Conditions Affecting the Hand and Wrist”, comprising health professionals, patients, and carers; here we report the data (uncertainty) extraction phase of that work. The PSP followed the robust methodology dictated by the JLA and sought to identify knowledge gaps, termed “uncertainties” by the JLA. Published Cochrane Systematic Reviews, Guidelines and Protocols, NICE (National Institute for Health and Care Excellence) Guidelines, and SIGN (Scottish Intercollegiate Guidelines Network) Guidelines were screened for documented “uncertainties”. A robust method of screening, internally verifying, and then checking uncertainties was adopted. This included independent screening and data extraction by multiple researchers and use of a PRISMA flowchart, alongside steering group consensus processes. Selection of research uncertainties was guided by the scope of the Common Conditions Affecting the Hand and Wrist PSP, which focused on “common” hand conditions routinely treated by hand specialists (including hand surgeons and hand therapists) and was limited to questions concerning the results of intervention, not the basic science or epidemiology behind disease. Results: Of the 2358 records identified (after removal of duplicates) which entered the screening process, 186 records were presented to the PSP steering group for eligibility assessment; 79 were deemed within scope and included for the purpose of research uncertainty extraction (45 full Cochrane Reviews, 18 Cochrane Review protocols, 16 Guidelines). These yielded 89 research uncertainties, which were compared with the stakeholder survey and added to the longlist where necessary, before the derived uncertainties were checked against non-Cochrane published systematic reviews. Conclusions: In carrying out this work, beyond reporting the output of the Common Conditions Affecting the Hand and Wrist PSP, we detail the methodology and processes that we hope can inform and facilitate the work of future PSPs and other evidence reviews, especially those with a broader scope beyond a single disease or condition.

    When is it Safe to Return to Driving after Spinal Surgery?

    No full text
    Study Design: Prospective study. Objective: Surgeons' recommendations for a safe return to driving following cervical and lumbar surgery vary and are based on empirical data. Driver reaction time (DRT) is an objective measure of the ability to drive safely, but there are limited data on the effect of cervical and lumbar surgery on DRT. The purpose of our study was to use DRT to determine when patients undergoing spinal surgery may safely return to driving. Methods: We tested 37 patients' DRT using computer software. Twenty-three patients (mean age 50.5 ± 17.7 years) underwent lumbar surgery, and 14 patients (mean age 56.7 ± 10.9 years) underwent cervical surgery. Patients were compared with 14 healthy male controls (mean age 32 ± 5.19 years). Patients having cervical surgery were subdivided into anterior versus posterior approach and myelopathic versus nonmyelopathic groups. Patients having lumbar surgery were subdivided into decompression versus fusion (with or without decompression) and single-level versus multilevel surgery. Patients were tested preoperatively and at 2 to 3, 6, and 12 weeks after surgery. The use of opioids was noted. Results: Overall, patients having cervical and lumbar surgery showed no significant differences between pre- and postoperative DRT (cervical p = 0.49, lumbar p = 0.196). Only patients having single-level procedures showed a significant improvement, from a preoperative DRT of 0.951 seconds (standard deviation 0.255) to 0.794 seconds (standard deviation 0.152) at 2 to 3 weeks (p = 0.012). None of the other subgroups showed a difference in DRT. Conclusions: Based on these findings, it may be acceptable to allow patients having single-level lumbar fusion who are not taking opioids to return to driving as early as 2 weeks after spinal surgery.
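    The pre- versus post-operative comparisons above are paired, within-patient contrasts. The abstract does not state which test produced its p-values, so as a minimal sketch of one common approach (a paired t statistic), using hypothetical reaction times rather than study data:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t statistic: mean within-subject difference divided by its
    standard error. Assumes equal-length, subject-aligned samples."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation of differences
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical DRTs in seconds (illustrative, not study data): a drop from
# roughly 0.95 s preoperatively toward 0.79 s at 2 to 3 weeks, echoing the
# single-level group's reported improvement.
pre = [0.95, 1.02, 0.88, 0.97, 0.93]
post = [0.80, 0.85, 0.76, 0.81, 0.78]
t = paired_t(pre, post)  # negative t: reaction times shortened (improved)
```

A strongly negative t here corresponds to consistently shorter post-operative reaction times; the study's conclusions rest on whether such within-patient changes reach statistical significance.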