
    Functional recapitulation of transitions in sexual systems by homeosis during the evolution of dioecy in Thalictrum

    After the devastating 2015 earthquake in Nepal, provisions for safe drinking water, personal hygiene, and sewage management were compromised among displaced people living in temporary shelters. Typhoid fever, which is transmitted by the faecal–oral route, is endemic in the Kathmandu valley, and outbreaks can occur in post-disaster situations. To reduce the risk of transmission and outbreaks, typhoid vaccine was introduced for young children and adolescents, the groups at highest risk of typhoid fever. In collaboration with Siddhi Memorial Hospital, the Nepal Paediatric Society, and Nagasaki University, a typhoid vaccination campaign was implemented in Bhaktapur district in the valley, covering all 23 temporary camps in the district. Among 4,263 children aged 2 to 15 years, 4,216 (98.9%) received a single dose of the typhoid Vi polysaccharide vaccine. The largest age group (47.8%) was 11 to 15 years, and 50.2% of recipients were girls. Only four children (0.1%) had an adverse event following immunization (AEFI). Local camp leaders, public health officials, and local youth clubs participated in the immunization programme. A review of admissions to the local children’s hospital showed no apparent increase in typhoid cases in the post-earthquake period. Despite the various difficulties of the post-earthquake situation in Nepal in 2015, the vaccination campaign for the prevention of typhoid fever was successfully carried out among young children and adolescents.
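
    As a quick arithmetic check of the coverage and AEFI figures quoted above, a minimal Python sketch (the counts are taken from the abstract; the variable names are our own):

```python
# Coverage and AEFI rates from the counts given in the abstract.
vaccinated, eligible = 4216, 4263
aefi_cases = 4

coverage = vaccinated / eligible        # proportion of eligible children vaccinated
aefi_rate = aefi_cases / vaccinated     # adverse events per vaccinated child

print(f"Coverage:  {coverage:.1%}")     # -> 98.9%, as reported
print(f"AEFI rate: {aefi_rate:.2%}")    # -> 0.09%, reported as ~0.1%
```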

    Evidence for Reduced Drug Susceptibility without Emergence of Major Protease Mutations following Protease Inhibitor Monotherapy Failure in the SARA Trial

    BACKGROUND: Major protease mutations are rarely observed following failure with protease inhibitors (PIs), and other viral determinants of PI failure are poorly understood. We therefore characterized Gag-Protease phenotypic susceptibility in subtype A and D viruses circulating in East Africa following viral rebound on PIs. METHODS: Samples from baseline and treatment failure in patients enrolled in the second-line LPV/r trial SARA underwent phenotypic susceptibility testing. Data were expressed as fold-change in susceptibility relative to an LPV-susceptible reference strain. RESULTS: We cloned 48 Gag-Protease-containing sequences from seven individuals and performed drug resistance phenotyping at the pre-PI and treatment-failure timepoints. For the six patients in whom major protease inhibitor resistance mutations did not emerge, the mean fold-change in EC50 to LPV was 4.07 (95% CI, 2.08–6.07) at the pre-PI timepoint. Following viral failure, the mean fold-change in EC50 to LPV was 4.25 (95% CI, 1.39–7.11; p = 0.91). All viruses remained susceptible to DRV. In our assay system, the major PI resistance mutation I84V, which emerged in one individual, conferred a 10.5-fold reduction in LPV susceptibility. One of the six patients exhibited a significant reduction in susceptibility between the pre-PI and failure timepoints (from 4.7-fold to 9.6-fold) in the absence of known major mutations in protease, but associated with changes in Gag: V7I, G49D, R69Q, A120D, Q127K, N375S and I462S. Phylogenetic analysis provided evidence of the emergence of genetically distinct viruses at the time of treatment failure, indicating ongoing viral evolution in Gag-Protease under PI pressure. CONCLUSIONS: Here we observe, in one patient, the development of significantly reduced susceptibility conferred by changes in Gag, which may have contributed to treatment failure on a protease inhibitor-containing regimen. Further phenotype-genotype studies are required to elucidate the genetic determinants of protease inhibitor failure in patients who fail without traditional resistance mutations, as PI use is scaled up globally.
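
    The fold-change metric used throughout these results is simply the EC50 of a patient-derived virus divided by that of the reference strain. A minimal sketch, with hypothetical EC50 values chosen to mirror the 4.7- and 9.6-fold figures above:

```python
# Fold-change in drug susceptibility relative to an LPV-susceptible
# reference strain. The EC50 values here are hypothetical placeholders,
# not measurements from the trial.
def fold_change(ec50_sample: float, ec50_reference: float) -> float:
    """Fold-change in EC50 relative to the reference strain."""
    return ec50_sample / ec50_reference

EC50_REFERENCE = 0.020                        # hypothetical reference EC50 (uM)
print(fold_change(0.094, EC50_REFERENCE))     # ~4.7-fold, pre-PI clone
print(fold_change(0.192, EC50_REFERENCE))     # ~9.6-fold, clone at failure
```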

    Prevalence of Group A beta-haemolytic Streptococcus isolated from children with acute pharyngotonsillitis in Aden, Yemen.

    OBJECTIVES: To estimate the prevalence of Group A beta-haemolytic streptococcus (GAS) and non-GAS infections among children with acute pharyngotonsillitis in Aden, Yemen; to evaluate a rapid diagnostic test and the McIsaac score for patient management in this setting; and to determine the occurrence of emm genotypes among a subset of GAS isolated from children with acute pharyngotonsillitis and a history of acute rheumatic fever (ARF) or rheumatic heart disease (RHD). METHODS: GAS infections in school-aged children with acute pharyngotonsillitis in Aden, Yemen, were diagnosed by a rapid GAS antigen detection test (RADT) and/or GAS culture from a throat swab. The value of the RADT and the McIsaac screening score for patient management was evaluated, and the emm genotype of a subset of GAS isolates was determined. RESULTS: GAS pharyngotonsillitis was diagnosed in 287/691 (41.5%; 95% CI 37.8–45.3) children. Group B, Group C and Group G beta-haemolytic streptococci were isolated from 4.3% of children. The RADT had a sensitivity of 238/258 (92.2%) and a specificity of 404/423 (95.5%) against GAS culture. A McIsaac score of ≥4 had a sensitivity of 93% and a specificity of 82% for confirmed GAS infection. The emm genotypes in 21 GAS isolates from children with pharyngitis and a history of ARF and confirmed RHD were emm87 (n = 11), emm12 (n = 6), emm28 (n = 3) and emm5 (n = 1). CONCLUSION: This study demonstrates a very high prevalence of GAS infections in Yemeni children and the value of the RADT and the McIsaac score in this setting. More extensive emm genotyping is necessary to understand the local epidemiology of circulating strains.
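
    The sensitivity and specificity quoted above follow directly from the raw counts given in the abstract; a minimal sketch:

```python
# RADT diagnostic accuracy against GAS culture, from the abstract's counts.
true_pos, culture_pos = 238, 258   # RADT-positive among culture-positive swabs
true_neg, culture_neg = 404, 423   # RADT-negative among culture-negative swabs

sensitivity = true_pos / culture_pos
specificity = true_neg / culture_neg

print(f"Sensitivity: {sensitivity:.1%}")   # -> 92.2%
print(f"Specificity: {specificity:.1%}")   # -> 95.5%
```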

    Haemoglobin level at birth is associated with short term outcomes and mortality in preterm infants

    Background Blood volume and haemoglobin (Hb) levels are increased by delayed umbilical cord clamping, which has been reported to improve clinical outcomes of preterm infants. The objective was to determine whether Hb level at birth was associated with short-term outcomes in preterm infants born at ≤32 weeks gestation. Methods Data were collected retrospectively from electronic records: the Standardised Electronic Neonatal Database, Electronic Patient Record, Pathology (WinPath), and Blood Bank Electronic Database. The study was conducted in a tertiary perinatal centre with around 5,500 deliveries and 750 neonatal unit admissions per year. All inborn preterm infants of 23 to 32 weeks gestational age (GA) admitted to the neonatal unit from January 2006 to September 2012 were included. The primary outcomes were intraventricular haemorrhage, necrotising enterocolitis, bronchopulmonary dysplasia, retinopathy of prematurity, and death before discharge. The secondary outcomes were receipt of a blood transfusion and lengths of stay in intensive care and on the neonatal unit. The association between Hb level (g/dL) at birth and outcomes was analysed by multiple logistic regression, adjusting for GA and birth weight (BWt). Results Overall, 920 infants were eligible; 28 were excluded because of missing data and 2 for lethal congenital malformations, leaving 890 for analysis. The mean (SD) GA was 28.3 (2.7) weeks, BWt was 1,140 (414) g, and Hb level at birth was 15.8 (2.6) g/dL. Hb level at birth was significantly associated with all primary outcomes (P <0.001) in univariate analyses. Once GA and BWt were adjusted for, only death before discharge remained statistically significant; the OR of death for infants with an Hb level at birth of <12 g/dL compared with those with a level of ≥18 g/dL was 4.1 (95% CI, 1.4–11.6). Hb level at birth was also significantly associated with receiving a blood transfusion (P <0.01) but not with length of stay in intensive care or on the neonatal unit. Conclusions A low Hb level at birth was significantly associated with mortality and receipt of a blood transfusion in preterm infants born at ≤32 weeks gestation. Further studies are needed to determine the association between Hb level at birth and long-term neurodevelopmental outcomes.
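
    The adjusted analysis described above is a standard logistic regression of death before discharge on Hb at birth, controlling for GA and BWt. A minimal sketch using statsmodels; the data frame below is simulated from the summary statistics in the abstract, since the study dataset is not public:

```python
# Illustrative logistic regression: death ~ Hb at birth + GA + birth weight.
# The data are simulated placeholders, not the study's records.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 890                                    # 920 eligible minus 30 excluded
df = pd.DataFrame({
    "death": rng.integers(0, 2, n),        # placeholder binary outcome
    "hb_birth": rng.normal(15.8, 2.6, n),  # g/dL, mean (SD) from the abstract
    "ga_weeks": rng.normal(28.3, 2.7, n),  # gestational age
    "bwt_g": rng.normal(1140, 414, n),     # birth weight
})

fit = smf.logit("death ~ hb_birth + ga_weeks + bwt_g", data=df).fit()
print(np.exp(fit.params))                  # odds ratios per unit increase
```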

    Engaging terminally ill patients in end of life talk: How experienced palliative medicine doctors navigate the dilemma of promoting discussions about dying

    Objective: To examine how palliative medicine doctors engage patients in end-of-life (hereon, EoL) talk, and whether the practice of “eliciting and responding to cues”, which has been widely advocated in the EoL care literature, promotes EoL talk. Design: Conversation analysis of video- and audio-recorded consultations. Participants: Unselected terminally ill patients and their companions in consultation with experienced palliative medicine doctors. Setting: Outpatient clinic, day therapy clinic, and inpatient unit of a single English hospice. Results: Doctors most commonly promoted EoL talk through open elaboration solicitations; these created opportunities for patients to introduce, and later further articulate, EoL considerations in such a way that doctors did not overtly ask about EoL matters. Importantly, the wording of elaboration solicitations avoided assuming that patients had EoL concerns. If a patient responded to open elaboration solicitations without introducing EoL considerations, doctors sometimes pursued EoL talk by switching to a less participatory and more presumptive type of solicitation, which suggested the patient might have EoL concerns. These more overt solicitations were used only later in consultations, which indicates that doctors give precedence to patients volunteering EoL considerations and offer them opportunities to take the lead in initiating EoL talk. There is evidence that doctors treat elaboration of patients’ talk as a resource for engaging them in EoL conversations. However, there are limitations associated with labelling that talk as “cues”, as is common in EoL communication contexts. We examine these limitations and propose “possible EoL considerations” as a descriptively more accurate term. Conclusions: By communicating, via open elaboration solicitations, in ways that create opportunities for patients to volunteer EoL considerations, doctors navigate a core dilemma in promoting EoL talk: giving patients opportunities to choose whether to engage in conversations about EoL whilst being sensitive to their communication needs, preferences and state of readiness for such dialogue.

    In one’s own time: Contesting the temporality and linearity of bereavement

    This article explores the experience and meaning of time from the perspective of caregivers who have recently been bereaved following the death of a family member. The study is situated within the broader cultural tendency to understand bereavement through the logic of stages, including the perception of bereavement as a somewhat predictable and certainly time-delimited ascent from a nadir in death to a ‘new normal’ once loss is accepted. Drawing on qualitative data from interviews with 15 bereaved family caregivers, we challenge the view of bereavement as a linear, temporally bound process, examining the multiple ways bereavement is experienced and how it variously resists ideas about the timeliness, desirability and even possibility of ‘recovery’. We posit, on the basis of these accounts, that the lived experience of bereavement poses considerable challenges to normative understandings of the social ties between the living and the dead, and requires a broader reconceptualization of bereavement as an enduring affective state.

    Anthropogenic disturbance in tropical forests can double biodiversity loss from deforestation

    Concerted political attention has focused on reducing deforestation [1–3], and this remains the cornerstone of most biodiversity conservation strategies [4–6]. However, maintaining forest cover may not reduce anthropogenic forest disturbances, which are rarely considered in conservation programmes [6]. These disturbances occur both within forests, including selective logging and wildfires [7,8], and at the landscape level, through edge, area and isolation effects [9]. Until now, the combined effect of anthropogenic disturbance on the conservation value of remnant primary forests has remained unknown, making it impossible to assess the relative importance of forest disturbance and forest loss. Here we address these knowledge gaps using a large data set of plants, birds and dung beetles (1,538, 460 and 156 species, respectively) sampled in 36 catchments in the Brazilian state of Pará. Catchments retaining more than 69–80% forest cover lost more conservation value from disturbance than from forest loss. For example, a 20% loss of primary forest, the maximum level of deforestation allowed on Amazonian properties under Brazil’s Forest Code [5], resulted in a 39–54% loss of conservation value: 96–171% more than expected without considering disturbance effects. We extrapolated the disturbance-mediated loss of conservation value throughout Pará, which covers 25% of the Brazilian Amazon. Although disturbed forests retained considerable conservation value compared with deforested areas, the toll of disturbance outside Pará’s strictly protected areas is equivalent to the loss of 92,000–139,000 km² of primary forest. Even this lowest estimate is greater than the area deforested across the entire Brazilian Amazon between 2006 and 2015 [10]. Species distribution models showed that both landscape and within-forest disturbances contributed to biodiversity loss, with the greatest negative effects on species of high conservation and functional value. These results demonstrate an urgent need for policy interventions that go beyond the maintenance of forest cover to safeguard the hyper-diversity of tropical forest ecosystems.
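
    A back-of-envelope check of the headline figures, under the naive assumption that, absent disturbance, conservation-value loss would scale proportionally with forest loss:

```python
# Without disturbance effects, losing 20% of primary forest would be
# expected to cost roughly a proportional ~20% of conservation value;
# the abstract reports 39-54% once disturbance is accounted for.
expected = 20.0                       # naive % loss of conservation value
for observed in (39.0, 54.0):         # reported % loss including disturbance
    excess = (observed - expected) / expected
    print(f"{observed:.0f}% observed -> {excess:.0%} above expectation")
# -> 95% and 170%, matching the reported 96-171% up to rounding
```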

    Innovation in health economic modelling of service improvements for longer-term depression: demonstration in a local health community

    Background The purpose of the analysis was to develop a health economic model to estimate the costs and health benefits of alternative National Health Service (NHS) service configurations for people with longer-term depression. Methods Modelling methods were used to develop a conceptual and health economic model of the current configuration of services in Sheffield, England for people with longer-term depression. Data and assumptions were synthesised to estimate the cost per Quality-Adjusted Life Year (QALY). Results Three service changes were modelled; each increased QALYs at increased cost. Versus current care, the incremental cost-effectiveness ratio (ICER) for a self-referral service was £11,378 per QALY. The ICER was £2,227 per QALY for the dropout-reduction service and £223 per QALY for an increase in non-therapy services. These results were robust against current cost-effectiveness thresholds and when accounting for uncertainty. Conclusions Cost-effective service improvements for longer-term depression have been identified, as have limitations of the current evidence on the long-term impact of services.
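
    The ICERs reported above are incremental cost divided by incremental QALYs relative to current care. A minimal sketch; the cost and QALY pairs below are hypothetical placeholders, not trial outputs:

```python
# Incremental cost-effectiveness ratio (GBP per QALY gained).
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical service: 500 GBP extra per patient and 0.044 extra QALYs
# give ~11,364 GBP/QALY, of the order of the self-referral figure above.
print(icer(cost_new=10_500, cost_old=10_000, qaly_new=1.044, qaly_old=1.000))
```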

    Thresholds of riparian forest use by terrestrial mammals in a fragmented Amazonian deforestation frontier

    Species persistence in fragmented landscapes is intimately related to the quality, structure, and context of remaining habitat remnants. Riparian vegetation is legally protected within private landholdings in Brazil, so we quantitatively assessed occupancy patterns of terrestrial mammals in these remnants, examining the circumstances under which different species effectively use them. We selected 38 riparian forest patches and five comparable riparian sites within continuous forest, installing four to five camera traps per site (199 camera-trap stations in total). Terrestrial mammal assemblages were sampled for 60 days per station during the dry seasons of 2013 and 2014. We modelled species occupancy and detection probabilities within riparian forest remnants and examined the effects of patch size, habitat quality, and landscape structure on occupancy probabilities. We then scaled up the modelled occupancies to all 1,915 riparian patches throughout the study region to identify which remnants retain the greatest potential to function as habitat for terrestrial vertebrates. Of the ten species for which occupancy was modelled, six responded to forest quality (remnant degradation, cattle intrusion, palm aggregations, and understorey density) or structure (remnant width, isolation, length, and the area of the patch from which it originates). Patch suitability was lower with respect to habitat quality than to landscape structure, and virtually all riparian remnants were unsuitable for maintaining a high occupancy probability for all species that responded to forest patch quality or structure. Beyond safeguarding legal compliance concerning the amount of riparian remnant, ensuring terrestrial vertebrate persistence in fragmented landscapes will require curbing the drivers of forest degradation within private landholdings.
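
    The occupancy modelling referred to above is typically a single-season occupancy model: each site is occupied with probability psi, and an occupying species is detected on any one survey with probability p, so an all-zero detection history can reflect either true absence or missed detection. A minimal sketch with toy camera-trap data (the study's covariate structure is omitted):

```python
# Single-season occupancy model fitted by maximum likelihood.
# Detection histories below are toy data, not the study's records.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, histories):
    """Negative log-likelihood of sites-x-surveys detection histories."""
    psi, p = 1 / (1 + np.exp(-np.asarray(params)))   # logit -> probability
    nll = 0.0
    for y in histories:
        d, J = y.sum(), len(y)
        if d > 0:                    # occupied and detected at least once
            nll -= np.log(psi) + d * np.log(p) + (J - d) * np.log(1 - p)
        else:                        # never detected: missed or truly absent
            nll -= np.log(psi * (1 - p) ** J + (1 - psi))
    return nll

histories = np.array([[1, 0, 1, 0], [0, 0, 0, 0], [1, 1, 0, 1],
                      [0, 0, 0, 0], [0, 1, 0, 0]])
fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(histories,))
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"occupancy ~ {psi_hat:.2f}, detection ~ {p_hat:.2f}")
```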