115 research outputs found

    Advantages and Limitations of Commercially Available Electrocuting Grids for Studying Mosquito Behaviour.

    Mosquito feeding behaviour plays a major role in determining malaria transmission intensity and the impact of specific prevention measures. The human landing catch (HLC) is currently the only method that can directly and consistently measure the biting rates of anthropophagic mosquitoes, both indoors and outdoors. However, because this method exposes participants to mosquito-borne pathogens, new exposure-free methods are needed to replace it. Commercially available electrocuting grids (EGs) were evaluated as an alternative to HLC using a Latin square experimental design in Dar es Salaam, Tanzania. Both HLC and EGs were used to estimate the proportion of human exposure to mosquitoes occurring indoors (πi), as well as its two underlying parameters: the proportion of mosquitoes caught indoors (Pi) and the proportion caught during the interval between the first and last hours when most people are indoors (Pfl). HLC and EGs accounted for 69% and 31%, respectively, of the total number of female mosquitoes caught, and both methods caught more mosquitoes outdoors than indoors. Results from the gold-standard HLC suggest that An. gambiae s.s. in Dar es Salaam is neither exophagic nor endophagic (Pi ≈ 0.5), whereas An. arabiensis is strongly exophagic (Pi ≪ 0.5). Both species prefer to feed after 10 pm, when most people are indoors (Pfl ≫ 0.5). EGs yielded estimates of Pi for An. gambiae s.s., An. arabiensis and An. coustani that were approximately equivalent to those obtained with HLC, but significantly underestimated Pfl for An. gambiae s.s. and An. coustani. The relative sampling sensitivity of EGs declined over the course of the night (p ≤ 0.001) for all mosquito taxa except An. arabiensis. Commercial EGs sample human-seeking mosquitoes with high sensitivity both indoors and outdoors and accurately measure the propensity of Anopheles malaria vectors to bite indoors rather than outdoors. However, further modifications are needed to stabilize sampling sensitivity over a full nocturnal cycle before they can be used to survey patterns of human exposure to mosquitoes.
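    The two underlying parameters are simple proportions of catch counts, so their estimation can be sketched directly. The hourly counts and the indoor window below are invented for illustration, and the final combination into πi uses one published form of the decomposition; the study's exact estimator may differ.

```python
# Hypothetical hourly catch data: (hour, indoor_count, outdoor_count).
# All numbers are invented for illustration, not taken from the study.
catches = [
    (19, 2, 8), (20, 3, 10), (21, 4, 9), (22, 6, 7),
    (23, 8, 6), (0, 9, 5), (1, 7, 4), (2, 5, 3),
]

indoor_total = sum(i for _, i, _ in catches)
total = sum(i + o for _, i, o in catches)

# Pi: proportion of all mosquitoes that were caught indoors.
Pi = indoor_total / total

# Pfl: proportion caught during the hours when most people are indoors
# (assumed here to be 22:00 onwards; the true window is survey-defined).
indoor_hours = {22, 23, 0, 1, 2, 3, 4}
Pfl = sum(i + o for h, i, o in catches if h in indoor_hours) / total

# One published way to combine the two into the proportion of human
# exposure occurring indoors; treat this as an assumption here.
pi_i = (Pi * Pfl) / (Pi * Pfl + (1 - Pfl))

print(round(Pi, 3), round(Pfl, 3), round(pi_i, 3))
```

    With these toy counts, Pi sits just below 0.5 and Pfl above 0.6, i.e. a mildly exophagic population that still bites mostly while people are indoors.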

    Ecology: a prerequisite for malaria elimination and eradication

    * Existing front-line vector control measures, such as insecticide-treated nets and residual sprays, cannot break the transmission cycle of Plasmodium falciparum in the most intensely endemic parts of Africa and the Pacific.
    * The goal of malaria eradication will require urgent strategic investment into understanding the ecology and evolution of the mosquito vectors that transmit malaria.
    * Priority areas will include understanding aspects of the mosquito life cycle beyond the blood-feeding processes which directly mediate malaria transmission.
    * Global commitment to malaria eradication necessitates a corresponding long-term commitment to vector ecology.

    Eliminating Malaria Vectors.

    Malaria vectors which predominantly feed indoors upon humans have been locally eliminated from several settings with insecticide-treated nets (ITNs), indoor residual spraying or larval source management. Recent dramatic declines of An. gambiae in east Africa, despite imperfect ITN coverage, suggest that mosquito populations can rapidly collapse when forced below realistically achievable, non-zero thresholds of density and supporting resource availability. Here we explain why insecticide-based mosquito elimination strategies are feasible and desirable, and how they can be extended to a wider variety of species by expanding the vector control arsenal to cover a broader spectrum of the resources mosquitoes need to survive. The greatest advantage of eliminating mosquitoes, rather than merely controlling them, is that this precludes local selection for behavioural or physiological resistance traits. The greatest challenges are therefore to achieve high biological coverage of targeted resources rapidly enough to prevent local emergence of resistance, and then to continually exclude, monitor for and respond to re-invasion from external populations.

    CD133 Is a Marker of Bioenergetic Stress in Human Glioma

    Mitochondrial dysfunction and a hypoxic microenvironment are hallmarks of cancer cell biology. Recently, many studies have focused on the isolation of brain cancer stem cells using CD133 expression. In this study, we investigated whether CD133 expression is regulated by bioenergetic stresses affecting mitochondrial function in human glioma cells. First, we determined that hypoxia induced a reversible up-regulation of CD133 expression. Second, mitochondrial dysfunction through pharmacological inhibition of the electron transport chain (ETC) produced an up-regulation of CD133 expression that was inversely correlated with changes in mitochondrial membrane potential. Third, generation of stable glioma cells depleted of mitochondrial DNA showed significant and stable increases in CD133 expression. These glioma cells, termed rho0 or ρ0, are characterized by an exaggerated, uncoupled glycolytic phenotype and by constitutive and stable up-regulation of CD133 through many cell passages. Moreover, these ρ0 cells display the ability to form “tumor spheroids” in serum-free medium and are positive for CD133 and the neural progenitor cell marker nestin. Under differentiating conditions, ρ0 cells expressed multi-lineage properties. Reversibility of CD133 expression was demonstrated by transferring parental mitochondria to ρ0 cells, resulting in stable trans-mitochondrial “cybrid” clones. This study provides novel mechanistic insight into the regulation of CD133 by environmental conditions (hypoxia) and mitochondrial dysfunction (genetic and chemical). Considering these new findings, the concept that CD133 is a marker of brain tumor stem cells may need to be revised.

    Need for recovery amongst emergency physicians in the UK and Ireland: A cross-sectional survey

    OBJECTIVES: To determine the need for recovery (NFR) among emergency physicians and to identify demographic and occupational characteristics associated with higher NFR scores. DESIGN: Cross-sectional electronic survey. SETTING: Emergency departments (EDs) (n=112) in the UK and Ireland. PARTICIPANTS: Emergency physicians, defined as any registered physician working principally within the ED, responding between June and July 2019. MAIN OUTCOME MEASURE: NFR Scale, an 11-item self-administered questionnaire that assesses how work demands affect intershift recovery. RESULTS: The median NFR Score for all 4247 eligible, consented participants with a valid NFR Score was 70.0 (95% CI: 65.5 to 74.5), with an IQR of 45.5-90.0. A linear regression model indicated statistically significant associations between NFR Score and gender, health conditions, type of ED, clinical grade, access to annual and study leave, and time spent working out-of-hours. Male physicians, consultants, general practitioners (GPs) working within the ED, those working in paediatric EDs and those with no long-term health condition or disability had lower NFR Scores. After adjusting for these characteristics, the NFR Score increased by 3.7 (95% CI: 0.3 to 7.1) and 6.43 (95% CI: 2.0 to 10.8) points for those with difficulty accessing annual and study leave, respectively. An increasing percentage of out-of-hours work raised the NFR Score almost linearly: 26%-50% out-of-hours work = 5.7 (95% CI: 3.1 to 8.4); 51%-75% = 10.3 (95% CI: 7.6 to 13.0); 76%-100% = 14.5 (95% CI: 11.0 to 17.9). CONCLUSION: Higher NFR scores were observed among emergency physicians than have been reported in any other profession or population to date. While out-of-hours working is unavoidable, the linear relationship observed suggests that any reduction may improve NFR. Evidence-based strategies to improve well-being, such as proportional out-of-hours working and improved access to annual and study leave, should be carefully considered and implemented where feasible.
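    The reported adjusted coefficients can be combined to illustrate the near-linear dose-response described above. This is a minimal sketch that assumes the simple additivity of a linear model; the factor names are invented labels, and interactions and the other covariates in the full model are ignored.

```python
# Adjusted NFR Score increments reported in the abstract
# (points on the NFR Scale; keys are illustrative labels only).
coef = {
    "difficult_annual_leave": 3.7,
    "difficult_study_leave": 6.43,
    "ooh_26_50": 5.7,
    "ooh_51_75": 10.3,
    "ooh_76_100": 14.5,
}

def predicted_increase(factors):
    """Sum the adjusted coefficients for the factors that apply.

    Additivity is a property of the linear model, not an extra
    claim made by the study; treat the total as indicative only.
    """
    return sum(coef[f] for f in factors)

# A physician with difficulty accessing both kinds of leave who
# works 76-100% out-of-hours: roughly 24.6 points above baseline.
print(predicted_increase(
    ["difficult_annual_leave", "difficult_study_leave", "ooh_76_100"]))
```

    Read against a median score of 70.0, an increment of this size moves a typical respondent well into the upper quartile of the observed distribution.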

    Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.

    BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-, 7 middle- and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complication and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were admitted to a critical care unit as routine immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700
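    As a quick consistency check, the headline rates can be reproduced from the raw counts given in the abstract. The only assumption is that all 207 in-hospital deaths occurred among patients who developed a complication, which is consistent with the reported 2.8% mortality in that group.

```python
# Counts reported in the abstract of the 27-country cohort study.
patients = 44_814
with_complication = 7_508
deaths = 207  # assumed all among patients with a complication

complication_rate = with_complication / patients           # reported 16.8%
in_hospital_mortality = deaths / patients                  # reported 0.5%
mortality_given_complication = deaths / with_complication  # reported 2.8%

print(f"complications: {complication_rate:.1%}, "
      f"in-hospital mortality: {in_hospital_mortality:.1%}, "
      f"mortality after complication: {mortality_given_complication:.1%}")
```

    All three recomputed rates round to the figures quoted in the abstract, so the counts and percentages are internally consistent.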